Tags · buoyant-data/hotdog · GitHub


Tags

v1.2.4

Remove hard-coded internal buffer size

This allows users to specify a flush interval; as long as they have
enough memory, hotdog will buffer as much as possible internally!
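An illustrative, stdlib-only sketch (not hotdog's actual code; all names here are hypothetical) of interval-based flushing with no hard-coded row cap: messages accumulate until the flush interval elapses, bounded only by available memory.

```rust
use std::time::{Duration, Instant};

// Hypothetical buffer: flushes on a time interval rather than at a
// hard-coded number of buffered rows.
struct Buffer {
    rows: Vec<String>,
    last_flush: Instant,
    flush_interval: Duration,
}

impl Buffer {
    /// Buffer a row; return the accumulated rows when the interval elapses.
    fn push(&mut self, row: String) -> Option<Vec<String>> {
        self.rows.push(row);
        if self.last_flush.elapsed() >= self.flush_interval {
            self.last_flush = Instant::now();
            Some(std::mem::take(&mut self.rows))
        } else {
            None
        }
    }
}

fn main() {
    let mut buf = Buffer {
        rows: Vec::new(),
        last_flush: Instant::now(),
        flush_interval: Duration::from_millis(200),
    };
    // Nothing flushes before the interval elapses, regardless of row count.
    for i in 0..5 {
        assert!(buf.push(format!("row {i}")).is_none());
    }
    std::thread::sleep(Duration::from_millis(250));
    let flushed = buf.push("row 5".into()).expect("interval elapsed");
    assert_eq!(flushed.len(), 6);
}
```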

v1.2.3

Correctly handle writing larger batch sizes for the parquet sink

Inside Apache Arrow (Rust) there's a 1024-row default batch size which
was clipping the amount of data being decoded when flushing massive
parquet buffers.
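A stdlib-only sketch of the failure mode described above (not hotdog's or Arrow's actual code): a decoder that, like arrow-json's, yields at most `batch_size` rows per call will clip a large buffer if the flush reads only one batch; the fix is to keep decoding until the buffer is drained.

```rust
// Hypothetical decoder that emits at most `batch_size` rows per call,
// mimicking Arrow's 1024-row default.
struct BatchDecoder {
    rows: Vec<u64>,
    batch_size: usize,
}

impl BatchDecoder {
    fn next_batch(&mut self) -> Option<Vec<u64>> {
        if self.rows.is_empty() {
            return None;
        }
        let n = self.batch_size.min(self.rows.len());
        Some(self.rows.drain(..n).collect())
    }
}

fn main() {
    // 3000 buffered rows, default-style batch size of 1024.
    let mut decoder = BatchDecoder {
        rows: (0..3000).collect(),
        batch_size: 1024,
    };

    // Buggy flush: reading a single batch writes only 1024 of 3000 rows.
    let clipped = decoder.next_batch().unwrap();
    assert_eq!(clipped.len(), 1024);

    // Correct flush: keep decoding until the buffer is empty.
    let mut total = clipped.len();
    while let Some(batch) = decoder.next_batch() {
        total += batch.len();
    }
    assert_eq!(total, 3000);
}
```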

v1.2.2

Remove the bounded channel for queueing into the parquet sink

v1.2.1

Make failing to infer the schema non-fatal

If a schema cannot be inferred it's important to log the error, but the
process crashing entirely is not ideal.

v1.2.0

Properly flush and exit on ctrl-c

Fixes #60

v1.1.0

Add support for defining schemas to be used by the sinks

Right now the Kafka sink does not support the use of the defined
schemas, but these allow for defining valid/acceptable schemas up front
for data written to specific topics.

What this will _not_ do, however, is any form of type coercion! Make
sure the schemas use the right types for the data coming in!
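A minimal, stdlib-only sketch of what "no type coercion" means here (all names hypothetical, not hotdog's actual API): a record only passes validation when every field's type matches the defined schema exactly; a string `"200"` is never converted to an integer.

```rust
use std::collections::HashMap;

// Hypothetical field/value types for a per-topic schema check.
#[derive(Debug, PartialEq)]
enum FieldType {
    Str,
    Int,
}

#[derive(Debug)]
enum Value {
    Str(String),
    Int(i64),
}

/// Every schema field must be present with exactly the declared type;
/// mismatched values are rejected, never coerced.
fn matches_schema(schema: &HashMap<&str, FieldType>, record: &HashMap<&str, Value>) -> bool {
    schema.iter().all(|(field, ty)| match record.get(field) {
        Some(Value::Str(_)) => *ty == FieldType::Str,
        Some(Value::Int(_)) => *ty == FieldType::Int,
        None => false,
    })
}

fn main() {
    let mut schema = HashMap::new();
    schema.insert("host", FieldType::Str);
    schema.insert("status", FieldType::Int);

    let mut record = HashMap::new();
    record.insert("host", Value::Str("web-1".into()));
    // "200" arrives as a string: no coercion, so validation fails.
    record.insert("status", Value::Str("200".into()));
    assert!(!matches_schema(&schema, &record));

    // Only an actual integer satisfies the Int field.
    record.insert("status", Value::Int(200));
    assert!(matches_schema(&schema, &record));
}
```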

v1.0.2

Lowercase options before they're passed through to object store

This also introduces the S3_OUTPUT_URL environment variable
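A sketch of the normalization step (illustrative only; the function name and option keys are assumptions, not hotdog's actual code): environment-style options typically arrive upper-cased, so keys are lowercased before being handed to the object store configuration, which expects lowercase keys.

```rust
use std::collections::HashMap;

/// Lowercase every option key so upper-cased environment-style settings
/// line up with the lowercase keys the object store config expects.
fn lowercase_options(options: HashMap<String, String>) -> HashMap<String, String> {
    options
        .into_iter()
        .map(|(key, value)| (key.to_lowercase(), value))
        .collect()
}

fn main() {
    let mut raw = HashMap::new();
    // Hypothetical upper-cased option as it might come from the environment.
    raw.insert(
        "AWS_ENDPOINT_URL".to_string(),
        "http://localhost:9000".to_string(),
    );

    let normalized = lowercase_options(raw);
    assert_eq!(
        normalized.get("aws_endpoint_url").map(String::as_str),
        Some("http://localhost:9000")
    );
}
```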

v1.0.1

Minor updates with performance improvements

v1.0.0

Add parquet support, yay!

v0.5.1

Denote that simd_json is unsafe
