Snowflake Support #5500
For CSV, we've actually relied on the kantan library. It looks to be possible to implement the IO simply with a … We'd be very happy to see a new contribution from your side!
The parser is implemented on the Beam side, using opencsv; only the downstream mapper can be specified. So it would need a PR on Beam to allow specifying another parser.
I meant to leverage the decoding part of kantan, with something like:

```scala
val thingMapper = new SnowflakeIO.CsvMapper[Thing] {
  override def mapRow(parts: Array[String]): Thing =
    implicitly[RowDecoder[Thing]].unsafeDecode(parts.toSeq)
}
```
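Since the snippet above depends on both Beam's `SnowflakeIO` and kantan.csv, here is a self-contained sketch of the same adapter pattern. The `CsvMapper` and `RowDecoder` traits and the `Thing` case class below are simplified stand-ins written by hand, not the real Beam or kantan APIs:

```scala
// Simplified stand-ins for Beam's SnowflakeIO.CsvMapper and kantan.csv's
// RowDecoder, so this sketch runs without either dependency.
trait CsvMapper[T] { def mapRow(parts: Array[String]): T }
trait RowDecoder[T] { def unsafeDecode(row: Seq[String]): T }

case class Thing(id: Int, name: String)

// A hand-written decoder; kantan.csv would derive this for case classes.
implicit val thingDecoder: RowDecoder[Thing] = new RowDecoder[Thing] {
  def unsafeDecode(row: Seq[String]): Thing = Thing(row(0).toInt, row(1))
}

// The adapter from the comment above: delegate the row callback to the decoder.
val thingMapper = new CsvMapper[Thing] {
  override def mapRow(parts: Array[String]): Thing =
    implicitly[RowDecoder[Thing]].unsafeDecode(parts.toSeq)
}

println(thingMapper.mapRow(Array("1", "widget"))) // Thing(1,widget)
```

The point of the adapter is that Beam's CSV splitting stays as-is while all per-field decoding logic lives in one typeclass instance.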
@RustedBones opened #5502
Hello here,
Apache Beam has Snowflake support, so it's possible to use it with:
However, a proper scio integration would be great. I suppose deriving a `SnowflakeIO.CsvMapper` for `Thing` would first need a PR in magnolify? I can work on it.
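To make the proposed derivation concrete, here is a hand-written sketch of the kind of mapper such a derivation would generate for a case class: per-field decoder instances combined positionally. The `CsvField` typeclass and everything else below are hypothetical illustrations, not magnolify's actual API:

```scala
// Hypothetical per-column decoder typeclass; a derivation library would
// combine one instance per constructor field automatically.
trait CsvField[A] { def decode(s: String): A }

implicit val longField: CsvField[Long]     = (s: String) => s.toLong
implicit val stringField: CsvField[String] = (s: String) => s
implicit val doubleField: CsvField[Double] = (s: String) => s.toDouble

case class Thing(id: Long, name: String, price: Double)

// The wiring a derived mapper might produce, written out by hand:
// column i is decoded with the instance for field i's type.
def mapRow(parts: Array[String]): Thing =
  Thing(
    implicitly[CsvField[Long]].decode(parts(0)),
    implicitly[CsvField[String]].decode(parts(1)),
    implicitly[CsvField[Double]].decode(parts(2))
  )

println(mapRow(Array("7", "bolt", "0.25"))) // Thing(7,bolt,0.25)
```

A magnolify-style PR would generate exactly this kind of positional wiring from the case class shape, so users never write `mapRow` by hand.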