
Very excited.

I'm likely very close to the ideal user. I don't program for work, but I write CSV-consuming tools in Python here and there to work on giant data exports when they outgrow Excel's built-in magic.

Most recently, the exact task was to consume a Zoom user export, filter it with a regex, and transform the table for upload back to Zoom as a CSV, but with different fields. This would translate very well to Neptyne if it supported 70k rows.
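The task described above (filter rows by regex, then remap to a different set of output fields) can be sketched with just the standard library. The field names and the helper below are hypothetical, purely for illustration:

```python
import csv
import io
import re

def filter_and_remap(src, pattern, field, mapping):
    """Keep rows whose `field` matches `pattern`, then rename/reorder
    columns per `mapping` ({output_field: input_field})."""
    rx = re.compile(pattern)
    rows = [r for r in csv.DictReader(src) if rx.search(r.get(field) or "")]

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(mapping))
    writer.writeheader()
    for row in rows:
        writer.writerow({dst: row[src_field] for dst, src_field in mapping.items()})
    return out.getvalue()

# Example: keep corporate addresses, emit only two renamed columns.
src = io.StringIO("Email,Department\na@corp.com,Sales\nb@other.org,IT\n")
print(filter_and_remap(src, r"@corp\.com$", "Email",
                       {"email": "Email", "dept": "Department"}))
```

Since `csv.DictReader` streams rows lazily, reading 70k rows this way is comfortably within what a laptop handles; only the filtered list is held in memory.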



This sounds like a great use case for a few lines of petl: https://petl.readthedocs.io/en/stable/

petl makes it very ergonomic to write ETL scripts like the one you're describing. Take a look at the search function (regex-based row filtering) and the fieldmap function (renaming and remapping columns).


(Neptyne cofounder here)

That does sound like a fitting use case. Technically it should be OK with 70k rows, but some optimizations still need to ship before it performs well at that scale. Ideally you'd be limited only by the memory allocated to the Python kernel; today there are some other limitations that get in the way.



