# crawlnicle

My personalized news and blog aggregator. Taking back control over the algorithm. Pining for the days of Google Reader. An excuse to write more Rust.

## Development Install

1. Install and run postgres.
2. Create the postgres user and database:

   ```sh
   createuser crawlnicle
   createdb crawlnicle
   sudo -u postgres -i psql
   postgres=# ALTER DATABASE crawlnicle OWNER TO crawlnicle;
   \password crawlnicle
   ```

   Or, on Windows in PowerShell:

   ```powershell
   & 'C:\Program Files\PostgreSQL\13\bin\createuser.exe' -U postgres crawlnicle
   & 'C:\Program Files\PostgreSQL\13\bin\createdb.exe' -U postgres crawlnicle
   & 'C:\Program Files\PostgreSQL\13\bin\psql.exe' -U postgres
   postgres=# ALTER DATABASE crawlnicle OWNER TO crawlnicle;
   \password crawlnicle
   ```
3. Save the password somewhere safe, then add a `.env` file to the project directory with the contents:

   ```env
   HOST=127.0.0.1
   PORT=3000
   DATABASE_URL=postgresql://crawlnicle:<password>@localhost/crawlnicle
   DATABASE_MAX_CONNECTIONS=5
   RUST_LOG=crawlnicle=debug,cli=debug,lib=debug,tower_http=debug,sqlx=debug
   ```
4. Install `sqlx-cli` with `cargo install sqlx-cli --no-default-features --features native-tls,postgres`.
5. Run `sqlx migrate run`, which will run all of the database migrations.
6. Build the release binary by running `cargo build --release`.
7. Run `./target/release/crawlnicle` to start the server.
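
The server reads its configuration from the `.env` file above. If you also want those variables available in your interactive shell (for ad-hoc `psql` or `sqlx` commands), a minimal sketch for exporting them is below; it writes a sample file to a temp path so it doesn't touch your real `.env`, and it assumes plain `KEY=VALUE` lines with no quoting or spaces:

```sh
# Sample config (same shape as the .env above; password elided in the real file)
cat > /tmp/crawlnicle-env-demo <<'EOF'
HOST=127.0.0.1
PORT=3000
EOF

# `set -a` marks every variable assigned while it is active for export,
# so sourcing the file puts each KEY=VALUE pair into the environment.
set -a
. /tmp/crawlnicle-env-demo
set +a

echo "$PORT"   # → 3000
```

To use this with the real file, source `./.env` instead of the temp path.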

## Using the CLI

This project also comes with a CLI binary which allows you to manipulate the database directly without needing to go through the REST API server. Run `./target/release/cli --help` to see all of the available commands.

## Running jobs

To periodically fetch new items from all of the feeds, execute the `cli crawl` command in a cronjob.
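
For example, a crontab entry along these lines would run the crawl every 30 minutes (the install path, schedule, and log location are placeholders to adapt, not part of this project):

```cron
# m h dom mon dow  command
*/30 * * * *  cd /path/to/crawlnicle && ./target/release/cli crawl >> /var/log/crawlnicle-crawl.log 2>&1
```

The `cd` matters if the binary resolves its `.env` file relative to the working directory.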