Flink Dashboard
curl localhost:8081
The Flink REST API is exposed on the same port. To list the running jobs:
curl localhost:8081/jobs
To get the details of a specific job:
curl http://localhost:8081/jobs/<job_id>/
curl http://localhost:8081/jobs/a137f1319905c1a11a1d92593b3241ed/
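Other endpoints worth knowing from the Flink REST API (all responses are JSON):
curl localhost:8081/overview
curl localhost:8081/taskmanagers
curl localhost:8081/jobs/<job_id>/exceptions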
Python is a second-class citizen in Flink; I recommend going with Java.
Go to the jobmanager and run:
./bin/flink run -py /opt/examples/table/1-word_count.py
You can also add configuration settings there, e.g. with -D options on the command line (see the sketch below). Depending on your execution mode, make sure to update the jars on both the Job Manager and the Task Managers.
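As a sketch, some commonly useful flink run flags (Flink 1.13 CLI; the parallelism and pipeline name here are made-up examples):
./bin/flink run -d -p 2 -py /opt/examples/table/1-word_count.py    # detached mode, parallelism 2
./bin/flink run -Dpipeline.name=word-count -py /opt/examples/table/1-word_count.py    # set a config option inline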
Which Scala version? Flink ships builds for Scala 2.11 and 2.12; check the suffix on your jars to see which is currently installed (see the one-liner after the listing below).
/opt/flink/lib
should have jars like:
flink-csv-1.13.0.jar
flink-dist_2.11-1.13.0.jar
flink-json-1.13.0.jar
flink-shaded-zookeeper-3.4.14.jar
flink-sql-connector-kafka_2.11-1.13.0.jar
flink-table_2.11-1.13.0.jar
flink-table-blink_2.11-1.13.0.jar
log4j-1.2-api-2.12.1.jar
log4j-api-2.12.1.jar
log4j-core-2.12.1.jar
log4j-slf4j-impl-2.12.1.jar
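A quick one-liner to answer the Scala question (a sketch; it just greps the version suffix out of the jar names):
ls /opt/flink/lib | grep -o '_2\.1[12]' | sort -u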
/opt/flink/opt/
should have jars like:
flink-azure-fs-hadoop-1.13.0.jar
flink-cep_2.11-1.13.0.jar
flink-cep-scala_2.11-1.13.0.jar
flink-gelly_2.11-1.13.0.jar
flink-gelly-scala_2.11-1.13.0.jar
flink-oss-fs-hadoop-1.13.0.jar
flink-python_2.11-1.13.0.jar
flink-queryable-state-runtime_2.11-1.13.0.jar
flink-s3-fs-hadoop-1.13.0.jar
flink-s3-fs-presto-1.13.0.jar
flink-shaded-netty-tcnative-dynamic-2.0.30.Final-13.0.jar
flink-shaded-zookeeper-3.5.6.jar
flink-sql-client_2.11-1.13.0.jar
flink-state-processor-api_2.11-1.13.0.jar
python
Plugins reside in their own folders and can consist of several jars; the names of the plugin folders are arbitrary. Each plugin is loaded through its own classloader and is completely isolated from every other plugin. That way, plugins such as flink-s3-fs-hadoop and flink-azure-fs-hadoop can depend on different, conflicting library versions, and there's no need to relocate any classes when building fat jars (shading).
What are examples of plugins?
Metric reporters
File systems (all file systems are pluggable, and can and should be used as plugins)
To use a pluggable file system, copy the corresponding JAR from the opt directory into its own folder under the plugins directory, e.g.:
mkdir ./plugins/s3-fs-hadoop
cp ./opt/flink-s3-fs-hadoop-1.16-SNAPSHOT.jar ./plugins/s3-fs-hadoop/
https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/filesystems/plugins/
/opt/flink/plugins
should have entries like:
README.txt
external-resource-gpu
metrics-datadog
metrics-graphite
metrics-influx
metrics-jmx
metrics-prometheus
metrics-slf4j
metrics-statsd
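Enabling one of these reporter plugins is configuration only. A minimal sketch for Prometheus in conf/flink-conf.yaml, assuming the Flink 1.13 reporter class name and an example port:
metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
metrics.reporter.prom.port: 9249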
E.g., to look inside an unzipped flink-s3-fs-hadoop jar:
mkdir flink-s3-fs-hadoop
mv flink-s3-fs-hadoop-1.13.0.jar flink-s3-fs-hadoop
cd flink-s3-fs-hadoop && unzip flink-s3-fs-hadoop-1.13.0.jar
root@jobmanager:/opt/flink/plugins/flink-s3-fs-hadoop# ls
META-INF
com
common-version-info.properties
core-default.xml
flink-s3-fs-hadoop-1.13.0.jar
org
org.apache.hadoop.application-classloader.properties
software
Service classes are discovered through the java.util.ServiceLoader, so make sure the service definitions stay in META-INF/services during shading.
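A quick sanity check that the service definitions survived shading (a sketch, using the jar unzipped above):
unzip -l flink-s3-fs-hadoop-1.13.0.jar 'META-INF/services/*'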