In case of problems with the automatic setup, you can prepare the environment manually:
1. Download [Apache Flink 1.14.0](https://archive.apache.org/dist/flink/flink-1.14.0/flink-1.14.0-bin-scala_2.11.tgz) and decompress it in `repo_dir`
2. Run `./init-configs.sh` to apply the Flink configuration used in our experiments
3. Get the input datasets, either by running `./get-datasets.sh` or by manually downloading the archive from [Zenodo](https://zenodo.org/records/14007044) and decompressing it in `repo_dir/data/input`
4. Compile the two experiment jars from the repo dir: `mvn -f helper_pom.xml clean package && mv target/helper*.jar jars; mvn clean package; mvn install` (a consolidated shell sketch of these steps follows the list)
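For convenience, the manual steps above can be expressed as a single shell sequence. The sketch below is only an illustration under a few assumptions: the repository root (`repo_dir`) is the current working directory, and `wget` and `tar` are available; if you skip `./get-datasets.sh`, the Zenodo archive still has to be downloaded and decompressed by hand.

```bash
# Manual setup sketch (run from repo_dir); assumes wget and tar are installed.

# 1. Download Apache Flink 1.14.0 (Scala 2.11 build) and decompress it here
wget https://archive.apache.org/dist/flink/flink-1.14.0/flink-1.14.0-bin-scala_2.11.tgz
tar -xzf flink-1.14.0-bin-scala_2.11.tgz

# 2. Apply the Flink configurations used in the experiments
./init-configs.sh

# 3. Fetch the input datasets (or download the archive from Zenodo and
#    decompress it in data/input)
./get-datasets.sh

# 4. Build the helper jar, then build and install the experiment jar
mvn -f helper_pom.xml clean package && mv target/helper*.jar jars
mvn clean package
mvn install
```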