crawlnicle
My personalized news and blog aggregator. Taking back control over the algorithm. Pining for the days of Google Reader. An excuse to write more Rust.
Development Install
- Install and run postgres.
- Create postgres user and database:
createuser crawlnicle
createdb crawlnicle
sudo -u postgres -i psql
postgres=# ALTER DATABASE crawlnicle OWNER TO crawlnicle;
\password crawlnicle
# Or, on Windows in PowerShell:
& 'C:\Program Files\PostgreSQL\13\bin\createuser.exe' -U postgres crawlnicle
& 'C:\Program Files\PostgreSQL\13\bin\createdb.exe' -U postgres crawlnicle
& 'C:\Program Files\PostgreSQL\13\bin\psql.exe' -U postgres
postgres=# ALTER DATABASE crawlnicle OWNER TO crawlnicle;
\password crawlnicle
- Save the password somewhere safe and then add a .env file to the project directory with the contents:
RUST_LOG=crawlnicle=debug,cli=debug,lib=debug,tower_http=debug,sqlx=debug
HOST=127.0.0.1
PORT=3000
DATABASE_URL=postgresql://crawlnicle:<password>@localhost/crawlnicle
DATABASE_MAX_CONNECTIONS=5
TITLE=crawlnicle
MAX_MEM_LOG_SIZE=1000000
- Install sqlx-cli with cargo install sqlx-cli --no-default-features --features native-tls,postgres
- Run sqlx migrate run, which will run all of the database migrations.
- Build the release binary by running cargo build --release.
- Run ./target/release/crawlnicle to start the server (a quick check follows this list).
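If everything is configured, the server should respond on the HOST and PORT set in .env. The following sanity checks are only suggestions, not part of the project's documented steps:
curl http://127.0.0.1:3000
sqlx migrate info
The first command should return a page from the running server, and the second (run from the project directory so sqlx-cli picks up DATABASE_URL) should list every migration as applied.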
Using the CLI
This project also comes with a CLI binary which allows you to manipulate the
database directly without needing to go through the REST API server. Run
./target/release/cli --help
to see all of the available commands.
Running jobs
To periodically fetch new items from all of the feeds, execute the cli crawl command in a cronjob.
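For example, a crontab entry along these lines would run the crawl every 30 minutes (the schedule, the install path, and the log file are placeholders to adapt to your setup):
*/30 * * * * cd /path/to/crawlnicle && ./target/release/cli crawl >> crawl.log 2>&1

Frontend
The frontend is built with Bun. It uses Stimulus to progressively enhance the server-built HTML. Currently, it only replaces UTC timestamps from the server with the time in the browser's timezone.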