Introducing two new open-source Clojure UI libraries by Factor House.
HSX and RFX are drop-in replacements for Reagent and Re-Frame, allowing us to migrate to React 19 while maintaining a familiar developer experience with Hiccup and a similar data-driven event model.
My co-founder uses the phrase minimal-viable-company for maximum-viable-product.
We bootstrapped for 5 years to well over $1M ARR before recently closing a seed round[1]. Clojure played a large part in our ability to deliver as a small team, and in our general happiness as programmers; it is a nice language to work in.
We will grow our Clojure core product team over the next couple of years, but mostly the funding round is about balancing our business to keep up with our product delivery.
Clojure has been very good to me (I had 15 years on the JVM prior to moving to clj/cljs in 2013-ish). YMMV.
It is immensely satisfying to build something that you believe in and succeed in selling it. I founded my company in 2019, and my experience and motivations in bootstrapping mirror yours.
We are in year 4 (commercials) and 6 (product development) at Factor House[1].
We build enterprise tooling for streaming systems (mostly Apache Kafka and Apache Flink). Selling software to enterprise customers is different from B2C: the sales cycles can be endless and your cashflow will likely be lumpier, but at the end of the day it's a very powerful thing to be profitable and independent.
As @yevpats points out, sometimes the bootstrapping story does miss some details. In our case we invested roughly $500k to get through the pre-commercial period; to achieve that we sold our house (my wife is also my co-founder). Not everyone can, or is mad enough to, commit resources at that early stage. To be honest, we were quite mad.
Prior to starting product development with Factor House we ran a consultancy that delivered systems for enterprise customers based on Kafka, Storm, Cassandra, etc - so we had plenty of experience. We also had consultancy customers who were eager to use the pre-commercial versions of our product and provide feedback.
I also run a meetup[2] in my hometown that specialises in programming solutions with distributed systems.
Last year we took a small amount of funding from Lighter Capital (non-dilutive, fairly simple loan terms) to unlock some growth.
Bootstrapping is hard, but my interactions with VC left me with the impression that it's a low-information lottery for the benefit of those who already have capital.
It seemed clear that if we took funding we would rapidly lose control of our vision and we don't need to 100x our business to achieve our goals. I would rather focus on delivery for our users and avoid adopting manic ideas to pay off 99 failed lottery tickets.
An enterprise toolkit for Apache Kafka (and now another for Flink).
I spent years working in large enterprise orgs, and a few more working with distributed systems. Along the way I picked up Clojure and, by the power of Grayskull, managed to combine all those factors into a company. Now I work with a small team shipping tools for programmers. Good times.
Today we have users in 100+ countries, but it started off as something I needed for myself / my team when working on client projects.
If you Americans buy a house with an 8% mortgage today, can you remortgage in the future if/when the rate drops? Is the buy-out penalty of remortgaging somehow higher than just selling and repurchasing?
My question is: do people get locked into uncompetitive, higher mortgage rates for long periods of time? Is there a significant downside? Is a 30-year fixed normal in the States?
30-year fixed rates don't exist in Australia. You'll get a 5 year fixed rate from ~6% or so, that's about it.
The American mortgage market is unusual in that it has 10-, 15- and 30-year fixed-rate debt. There are generally no prepayment penalties and no balloon payment (each payment is the same amount, even the last one). You can pay down extra any time you want and it reduces your principal accordingly.
The maturities and payment structures are quite generous compared to many other countries' mortgage products. Of course there are shorter maturities and different types of adjustable-rate mortgages, but these are not popular (fallout from the 2008 crisis and the general low-interest-rate environment).
Edit: there are also 40-year fixed products starting to be offered.
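For readers outside the US, the level-payment structure described above falls straight out of the standard amortization formula. A minimal sketch (the $300k/6% figures are made-up illustrative numbers, not current market rates):

```python
def monthly_payment(principal, annual_rate, years):
    """Level-payment amortization: every payment, including the last, is equal."""
    r = annual_rate / 12           # monthly interest rate
    n = years * 12                 # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical numbers: $300,000 borrowed at 6% over 30 years
payment = monthly_payment(300_000, 0.06, 30)   # ~ $1,798.65/month
```

Because the payment is fixed, early payments are mostly interest and later ones mostly principal, which is why the prepayment discussion below matters so much.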
Government policy. The US has multiple quasi-government mortgage market makers (because just one is somehow not enough...) that define baseline terms for home mortgages, and everyone must play their tune. See Fannie Mae and Freddie Mac. Seriously. That's what they're named.
There is another one tangentially involved as well that I can't remember the name of.
Exactly. Fannie Mae and Freddie Mac buy around 70 percent of the mortgages issued by banks. They only buy conforming mortgages, meaning mortgages that conform to their terms.
One can see there really isn't a laissez-faire free market at work when it comes to housing in the US. The government is in deep, and it's regulated out the yin-yang.
Before the New Deal, real estate loan terms were exploitative: interest-only loans that the bank could call in at any time. Worse, they could demand payment in a fixed amount of gold. And when they foreclosed, the owner lost his entire collateral.
The first three years of the Great Depression were an orgy of foreclosures driven by bankers' greed and panic. FDR closed the banks and seized all gold except personal jewelry. The New Deal introduced 30-year fixed-rate mortgages to make sure the banks couldn't do that again. Loosening the rules led to the 2008 crisis, where they did it again, but the rules still protected most.
A family member recently got a long-term, fixed, low-rate mortgage in Belgium, and I'm curious how different things are compared to the UK, where mortgage rates are higher and fixed for shorter terms.
Is the Belgian bank losing money compared to the UK one? Is there state intervention?
In the UK it's long been possible to get fairly long-term fixed rates - at least 10 years.
They just don't tend to sell very well - when interest rates are low [1], it's not particularly appealing to fix at 2.69% for 10 years when you could fix at 1.94% for 5 years or 1.25% for 2 years.
And coming off the back of two decades of rock bottom interest rates, a lot of people didn't anticipate that they'd be remortgaging at a >5% interest rate.
There is no state intervention. Depending on market conditions, a 30-year fixed can have a higher rate than a 25-year.
It’s basically hedged with long term bonds (Belgian or European) + a profit margin for the bank + risk based on your profile (age, health, employment history, …)
I guess UK banks are just hedging with shorter term bonds compared to Belgian ones.
The big difference is that in most European countries I know of, you are locked into that fixed rate for the duration of the loan and cannot refinance or pay it off early without getting hit with a huge penalty fee, essentially equal to the interest payments the bank would be missing out on. In the US you can pay off and/or renegotiate early without those penalties.
Not true; it depends on the European bank. I paid off my mortgage in the Czech Republic, originally 25 years IIRC, in full with a one-time payment and no extra fees. Whether the bank was happy or wanted me to keep paying all those years is another story, but the contract specifically allowed it.
My French mortgage has a 25-year fixed part; there's no point paying that one down early, since the fee would be the sum of all the fixed interest for 25 years (what you wrote). The other part is recalculated every 3 months from EURIBOR (not that great now, just like elsewhere). That one I can pay off partially or fully at any time without fees.
My Swiss mortgage is a completely different beast (also split in two parts, one fixed and one variable based on the SARON rate), like nothing you'll see anywhere else in the world IIRC. There's the usual 20% cash down payment, then over the next 15 years I need to pay off another 15% of the property, and the rest is just interest payments. We'll never fully own the property, and it's very disadvantageous tax-wise to own it outright (so nobody here does if they can avoid it). The Swiss invented an additional property tax (imputed rental value) calculated from the hypothetical rent you could extract from a given property; you are taxed on this theoretical income even if it's your primary residence.
This sounds strange.
Banks typically hedge their fixed rate loan portfolio because there aren't many equivalent long-dated fixed-rate funding sources available to them.
If the US market is such that borrowers can repay early or renegotiate long-dated fixed-rate mortgages without penalties, the banks are practically guaranteed significant losses when fixed rates decline.
Do US banks just charge higher spreads than European ones to compensate for this? That sounds undesirable, similar to tax loopholes: everyone pays more to compensate the enlightened few that actually take advantage of something that _everyone_ would want to do.
The US mortgage market is essentially backstopped by the US government. Banks can sell the fixed rate mortgages to a government backed bank at a guaranteed rate and so don't have to hold the interest rate risk on their books. The US government (both parties) has long believed that home ownership is important and have a lot of policies to encourage it, this is one of them.
You can repay early in the Netherlands as well. A friend of mine works for a major bank to hedge the risk of their mortgage portfolio. He mentioned once that the biggest risk for Dutch banks is not the risk of default, but risk of early repayment. This always surprised foreign investors when they did due diligence to invest in Dutch mortgages.
They have ways to hedge this risk. I don't know if this is desirable, but that is probably the case in the US as well.
Spain is similar. We recently locked in 2.65% for 5 years, but around 3% for 15 years was also available. The 15-year came with early repayment penalties, though.
*so long as your house appraised for high enough value to meet the debt to equity ratio. So if mortgage rates go down, but your home price crashes 10%, you’ll likely only be able to refinance by significantly paying down the mortgage balance.
Yes, you can refinance any time with nearly no cost. During covid years, a lot of homeowners refinanced multiple times, each time with a small bonus (under 3K) for refinancing.
What people don't understand, though, is that interest payments are front-loaded. Most of the early payments will be almost all interest, and with frequent refinances most people are paying interest all the time, extending the mortgage by a few years. Most only think of cash flow, and the payments appear lower if you don't account for those extra years.
In the US mortgage market, this is called curtailment. For example, if you make one extra payment per year, you will shorten a 30 year fixed rate mortgage by about 4.5 years. Also, for most US mortgages, you are allowed to prepay 100% with no penalty. This allows for refinancing.
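That curtailment effect is easy to check with a quick simulation. The numbers below (a $300k loan at 5% with its ~$1,610 scheduled payment) are hypothetical, and the exact years saved depends on the rate:

```python
def months_to_payoff(principal, annual_rate, payment, extra_per_year=0.0):
    """Simulate month-by-month amortization; the optional extra payment
    is spread evenly across the year (pro-rated monthly)."""
    r = annual_rate / 12
    balance, months = principal, 0
    while balance > 0:
        balance += balance * r                     # accrue one month of interest
        balance -= payment + extra_per_year / 12   # scheduled + pro-rated extra
        months += 1
    return months

# Hypothetical $300,000 loan at 5%; 1610.46 is its 30-year scheduled payment
base = months_to_payoff(300_000, 0.05, 1610.46)
curtailed = months_to_payoff(300_000, 0.05, 1610.46, extra_per_year=1610.46)
years_saved = (base - curtailed) / 12              # roughly 4-5 years at this rate
```

At higher rates the saving grows, since a larger share of each early payment is interest.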
What I do not understand: Why don't more countries do this? Have large gov't financing companies that guarantee certain fixed rate mortgages? Overall, it is a huge win for the middle class home owner in the US.
>For example, if you make one extra payment per year, you will shorten a 30 year fixed rate mortgage by about 4.5 years.
This doesn't sound right. If I make 5 extra payments (either in 1 year or over 4 years), the mortgage will be shortened by 5 x 4.5 years = 22 years. I'm sure everyone would do this, 5 extra payments is super easy!
You misunderstood my original post. One extra payment per year means 13 mortgage payments instead of 12. (Almost all US mortgages require monthly payments.) To be clear: this extra payment is strictly optional, but it is used to demonstrate how curtailment can meaningfully reduce the life of your loan and the amount of interest paid.
At the very least, during periods when interest rates are significantly above your mortgage rate, such as now, you should put it in a money market account instead of in your mortgage. It's the same amount of risk, but it's liquid. Really, you could do a long term Treasury bond for the same reasoning (same risk, same liquidity).
- Yes, typically we can refinance whenever we like, _but_ it extends the mortgage to a new 30-year term, along with additional direct immediate costs (plus human inertia). Unless interest rates were alarmingly high for your last go-round (ahem), you're directly incentivized and indirectly likely not to do so.
- I own properties in Canada (yay Commonwealth!). The notion of a 30-year fixed does not exist. One can get a 25-year amortization, but typically only with a 5-10 year guarantee for a fixed rate.
- As an American, Canadians are insane for buying into this system. Our system is so much more favorable to anyone with good enough credit to be approved for a loan it's literal comedy. Also our standards for approving someone for a loan seem to be lower (that said, I had no credit history in Canada when I started this adventure, so perhaps residents get a better deal).
- As a property investor, I'm happy to control for the cash I sink into my investments in interest versus the returns I get from rental revenue. Combining that with exchange rates and US interest rates versus Canadian, I <3 Canada.
- Fully variable interest mortgages are for suckers (and in that regard, I do have some regrets).
(bias: I <3 Canada regardless -- I'd live in Whistler, BC if circumstances allowed)
How come? Here in the UK you just remortgage for the remaining term of your mortgage; if you have 14 years left, you remortgage for 14 years. Is that something you have to do in America, or just what most people choose to do?
In the US lenders generally only offer a few options for the lengths of fixed rate mortgages, with 15 and 30 years probably being the most common.
There is generally no prepayment penalty here, so if you want some length that isn't one of the standard ones, you can just get a longer one and then pay some extra principal each month to pay it off over the timeframe you wanted.
It is amazing how much less time it takes to pay off a 30-year mortgage if you increase the payments by 10%. For a good many years at the start, payments are mostly just interest.
No, this poster is not correct. You CAN refinance for another 30-year note, or a 15-year note, or a 10-year note, or whatever the bank will let you refinance for and you're willing to sign up for.
All you're doing is taking out another loan and using that loan to pay off the original loan. So whatever terms you can get from the bank are fair game.
Also, there's nothing stopping you from paying the loan off early.
Refinancing has a significant cost, though. The various fees and transaction costs add up to $10-20k+ in my experience. It's not at all free; it only makes sense if the savings from the difference in interest rate exceed the transaction cost.
Interesting. Here in the UK there are usually literally no costs to remortgaging. We remortgaged a couple of years ago and the only cost was £500 for a solicitor to look through the paperwork (which we convinced the bank to pay for in the end, so it cost us nothing overall).
Same here. There are (usually) no fees for taking out a new mortgage either. A bank might charge you something for making the transfer, but it's like £50, completely insignificant.
These 30-year terms and favorable features like no prepayment penalties or balloon payments are a result of government regulations and subsidies created after the Great Depression in the 1930s, designed to 1) prevent people from losing homes when interest rates increase, and 2) encourage homeownership by making mortgages easier and more rational for buyers.
There are other government regulations and subsidies that encourage and compensate lenders for participating in this market. The government gives access to low-interest loans through the Federal Reserve banking system, created private entities (such as Fannie Mae and Freddie Mac) that purchase conforming mortgages, and requires buyers to pay for loan default insurance until they have a specific amount of equity in the home (which compensates the lender in the event of a foreclosure).
These policies and subsidies are the only reason this market exists, is so beneficial for buyers, and allows for such long-term risk-taking.
When you refinance, it’s just a new mortgage. You can do it purely for rate, or you can do a cash out refi (refinance at market rate for your house, pocket the equity you’ve built up as cash, but now your loan is bigger). The loan terms are like a regular mortgage, because they basically are. The originator of the mortgage often sells it on the secondary market.
Well yes, obviously - it's the same here. I just read what OP said as a requirement that you have to extend your mortgage by another 30 years. Here you just get a new mortgage of any length you want - if you fancy 9 years or 12 or 38, that's fine.
Extending isn't the right word. You're applying for a completely new one, often with a different bank. The common options are 15-year and 30-year terms. The new one pays off the old one, and if it's structured that way you can even get extra cash out, though that may increase your interest rate.
In this case you would likely do a 15-year term loan. The previous comment is misleading or incorrect in stating that you always have to take a new 30-year loan when refinancing.
When I refinanced they let me pick a term of 27 years, though of course it was at the 30-year rate. There is no advantage to doing this versus just automatically overpaying the mortgage every month to hit the target payoff date.
> As an American, Canadians are insane for buying into this system.
As a European, the US mortgage system, combined with very generous tax-deductible interest rules, is probably one of the most generous and property-owner-friendly systems around. As a property (and mortgage) owner myself, I'm very envious of it.
That being said, had I been a renter in the US, I would probably be very upset about how much taxpayer money goes to supporting homeowners.
There is a lot of economic research around the second order impact of these tax deductions. Do they, in fact, increase the cost of homes? There is _some_ evidence that says yes, so the tax deduction is offset by higher purchase price.
The standard deduction has been significantly increased in the US recently. Mortgage interest is often not worth itemizing unless you're in a high cost of living area or have a big house elsewhere.
> As an American, Canadians are insane for buying into this system.
In theory, there’s nothing wrong with some/tons of uncertainty, as long as people don’t play along and don’t overpay while hoping for the best. Buuuuut people are stupid and do just that.
30 year fixed and 15 year fixed are very common. There is also 5 year fixed then variable, but those burnt a lot of people in 2008 so they have a bit of a bad reputation with some. There are generally no early payment penalties. Refinancing is easy and considered worthwhile whenever the rates improve by 0.5 percent or more.
It's easy to refinance at a lower rate. You essentially just pay for and qualify for a new mortgage, the fact that it's a refinance and not a new house you're buying is mostly immaterial.
So you're out a few grand in fees, and if you somehow become less creditworthy it may not work.
When interest rates first spiked it seems like the prevailing wisdom was that they wouldn't stay high for long, so buyers should just swallow the higher monthly payment "for a year or two" then plan to refi.
Yes, you can refinance to a lower rate. It's easy and people do it all the time. There is some cost overhead, so you don't do it every time rates drop a tiny bit. But if they drop more than 1%, it's easily worth it.
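The "worth it above ~1%" rule of thumb falls out of a simple break-even calculation. A rough sketch with made-up numbers (an interest-only approximation that ignores the amortization schedule):

```python
def refi_breakeven_months(balance, old_rate, new_rate, closing_costs):
    """Months until the interest saved at the new rate covers the refinance fees."""
    monthly_savings = balance * (old_rate - new_rate) / 12
    return closing_costs / monthly_savings

# Hypothetical: $400k balance, dropping from 7% to 6%, $8k in fees
months = refi_breakeven_months(400_000, 0.07, 0.06, 8_000)   # ~24 months to break even
```

If you expect to keep the house (and the loan) longer than the break-even period, the refinance pays for itself.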
> Yes, you can refinance to a lower rate. It’s easy and people do it all the time.
It's almost unbelievable how easy it can be.
I got a call from the company that held my mortgage asking why I hadn't responded to the refinance offer they had sent me. I told them I wasn't aware of any such offer. They said they had FedExed an offer to me a couple weeks earlier.
I went and looked on the front porch, and sure enough there was a thick FedEx package there. I hadn't noticed that because I used the back door as my main entry/exit door.
Inside was all the paperwork, prefilled, for a refinance with instructions that said all I had to do to accept was call them and tell them, and then they would send a notary to meet me at home or at my office with a copy of the documents for me to sign.
Prices are not likely to crash with the conditions we have in place currently. Prices crashed in 2008 because there was 10 to 15-year-long bipartisan push that "everyone needs to be a homeowner." This let banks write adjustable-rate loans to people who arguably never should have been touching a mortgage given their financial situation. And when interest rates reset, enough people started defaulting that the whole system collapsed. And that's without getting into the funny business of securitizing those bad mortgages and using them as investment vehicles.
These days, mortgages are generally very well-underwritten and only given to people who can afford to pay. High interest rates are going to put a cap on prices, but where we're at now is a supply crunch. 2008 wiped out the homebuilding industry and now there's a supply crunch with not enough houses for the amount of people who want to buy, which is driving up prices.
I love jq so much that we implemented a subset of it in Clojure so our users could use it to munge/filter data in our product (JVM- and browser-based Kafka tooling). One of the most fun pieces of coding I've done, though I am a bit odd and love writing grammars (big shoutout to Instaparse![1]).
I learned through my implementation that jq is a Lisp-2[2], which surprised me, as it wasn't obvious from the grammar.
I can't stand jq. I realize this is an unpopular opinion, and our codebase at work has plenty of jq in the bash scripts, some of it even code that I wrote. I begrudgingly use it when it's the best option for me. But something about it rubs me the wrong way - I think it's the unintuitive query syntax, the need to search for every minute step of what I'm trying to do, and the frequency with which that leads to cryptic answers that I can only decipher if I am some sort of jq expert.

I have this instinctive reaction to all DSLs that embed themselves into strings, like htmx and tailwind (both embedded in attribute string values). I realize some people like it, it's a well-made piece of software, and I will even admit that sometimes there is no better choice. But I guess I just hate that it's necessary? I could also admit it's the least-bad option, in the sense that it's a vast improvement over the various sed/awk/cut monstrosities for parsing JSON in bash.

Certainly once you find the right incantation, it's perfect - it transforms raw stdin into parsed JSON that you can manipulate into exactly what you need. But for me, it ranks right next to regex in terms of "things I (don't) want to see in my code." I hate that the jq command is always some indecipherable string in the middle of the script. The only real alternative I've ever used is piping to a Python program defined inline in a heredoc, but that ends up being at least as nasty as the jq script.
> I hate that the jq command is always some indecipherable string in the middle of the script
It might be worthwhile to just learn how jq works. At the end of the day, you need to learn some language to parse json. I hate DSLs too, but I cannot think of anything as useful and concise as jq.
> but that ends up being at least as nasty as the JQ script
That's exactly why jq is so nice. Nice alternatives just don't exist.
> That's exactly why jq is so nice. Nice alternatives just don't exist
Write a simple Python script: parse the JSON into native objects, manipulate those objects as desired with standard Python code, then serialize back to JSON if necessary. Voila, you have a readable, maintainable, straightforward solution, and the only dependency (the Python interpreter) is already preinstalled on almost every modern system.
Sure, you may need a few more lines of code than what would be possible with a tailor-made DSL like jq, but this isn't code golf. Good code targets humans, not "least possible number of bytes, arranged in the cleverest possible way".
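To make the comparison concrete, here is the kind of plain-Python filter the comment above is describing (the sample data and field names are invented for illustration):

```python
import json

def filter_by_author(raw_json, author):
    """Parse JSON into native objects, filter with ordinary Python, re-serialize."""
    data = json.loads(raw_json)
    picked = [item for item in data if item.get("author") == author]
    return json.dumps(picked, indent=2)

# In a real pipeline this string would come from sys.stdin.read()
sample = '[{"author": "alice", "msg": "fix"}, {"author": "bob", "msg": "feat"}]'
result = filter_by_author(sample, "alice")
```

The equivalent jq one-liner would be shorter, which is exactly the trade-off being debated here.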
The very existence of DSL tools like jq is a testament to the fact that people don't want to reach for a general-purpose language to solve every kind of problem.
I'm also convinced that a big subset of the "use a general-purpose language for everything" crowd do it because they want to use their shiny hammer on this nail as well.
> Sure, you may need a few more lines of code than ...
jq integrates very nicely into bash scripts. Especially in between pipes, a short and simple jq snippet can work wonders for the readability of the overall script.
On the other hand, if the bash script becomes too complex, it may be a good idea to replace the entire script with Python (instead of just the JSON-parsing part).
> ... if the reader happens to be familiar with the niche language "jq".
Eh. Linux/Unix has always had an affinity for DSLs and mini-languages. If you're willing to work with bash, sed, awk, perl, lex, yacc, bc/dc etc. jq doesn't seem like it should cause too much consternation.
> Especially in between pipes a short&simple jq-snippet
Many of them are not short and simple, though. And each time you do some transformation, you pretty much need to go in and out of jq at each step if you want to make decisions or get multiple types of results without processing the original multiple times.
The point in my career at which I used jq the most was when I was doing a lot of exploratory work with Elasticsearch on indexed data and search results: things like trying to figure out what sort of values `key` might have, grabbing returned ids, etc.
Second to this, I've mostly used jq to look at OpenAPI/swagger files, again just doing one-off tasks, such as listing all api routes, listing similarly named schemas, etc.
From what I've seen at the companies I've worked for, this is fairly consistent, but naturally I can't speak for everyone's use cases. At the end of the day, I don't think most people use jq in places where readability or maintainability matter most.
Yeah, except the Python solution is probably going to be several hundred lines instead of a few.
Python is often not installed in server environments unless Python is the runtime.
Want to use a non-standard library? Now your coworkers are suddenly in Python dependency hell. Better hope anyone else who wants to use this is either familiar with the ecosystem or just happens to have an identical runtime environment.
Or someone could just curl/apt/dnf a jq binary to use your 3-line query, instead of maintaining all of this plus 200 lines of Python.
I go to jq for the same reason I go to regular expressions. If you tell me this is too complex
(?:[A-Z][a-z]+_?(\d+))
Then I don't know what to tell you. Do you think that's too complex and should be a Python script too? I don't think so. It looks complex, but if you just learn it, it's easier than a "simple" script doing the same thing.
I'd argue it's good code if you don't have to sift through lines of boilerplate to do something so trivial in jq or regex syntax.
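For what it's worth, the pattern above does what it looks like; a quick Python check (the input string is an arbitrary made-up example):

```python
import re

# The regex from the comment above: a capitalized word, an optional
# underscore, then a captured run of digits.
pattern = re.compile(r"(?:[A-Z][a-z]+_?(\d+))")

match = pattern.search("Build_42 finished")
# match.group(0) is "Build_42"; the captured group, match.group(1), is "42"
```

A hand-rolled loop doing the same scan would be many times longer, which is the point being made.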
I do lots of exploratory work on various structured data, in my case often debugging media files via https://github.com/wader/fq, which means doing lots of use-once queries on the command line or in a REPL. In those cases jq's line-friendly, composable syntax and generators really shine.
> Something not having alternatives doesn't make it necessarily nice
Of course not, but compared to every alternative today, jq is eons better than everything else. Its conciseness, ease of use, and ease of learning all make it awesome. So as of right now, it is the nicest thing to use by far.
Personally, though, I don't wish for better. jq is missing nothing that I want.
I really like jq, but I think there is at least one nice alternative: jet [1].
It is also a single executable, written in Clojure, and fast. Among other niceties, you don't have to learn any DSL in this case - at least not if you already know Clojure!
I hadn't seen this before. At a quick glance, the syntax looks fine, though I don't know what command-line utility I'd need to use it with. It makes me wonder how hard a translator from jq syntax to JSONPath would be... then we could have our cake and eat it too.
In my (potentially unpopular) opinion, jq has the same appeal to nerds as stuff like Perl does. I say this as someone who did Perl for 20 years but now prefers Python or JS...
For many people, regexes are as bad as jq queries... and vice versa. I would not recommend writing a Python script instead of a regexp, but indeed it may work just as well for small data and be more readable.
I love regex and have been mastering it since 1999 - so much so that in 2013 I used it in production to parse a binary protocol with dynamically sized fields. I believe the project is still talking to 10k-plus devices. Google must have just released protocol buffers... I would love to finally see regexes that can work over a custom flow of objects, and also on trees.
I also loved XPath, which is very powerful and very comprehensible, and then there is CSS1/2/3, which again offers queries over structured, tree-like data.
The prospect of now learning jq does not appeal to me that much, even though I appreciate its ingenuity. I may recommend it to dev/ops colleagues now and then, but for me this syntax is a lot of additional cognitive load that does not necessarily pay off. Of course, if there is a large amount of JSON data, it is the Swiss Army knife.
But nowadays I'll likely use some LLM to generate the jq query for me. I'd also joke with my bash-diehard colleagues, who would love one more DSL...
For simple things like navigating down one key or one array entry, I know it by heart, and it's incredibly useful. But for anything more complicated, I'm too lazy to look up the documentation.
jq falls into the bucket, along with sed/awk, of "tools I once wished to become an expert in, but never will because ChatGPT came along".
I would also put regex into that bucket, but it's so ubiquitous that I've already learned it. I wonder if the new wave of coders learning to code via ChatGPT will think of regexes the way I think of sed/awk.
I think these very terse languages are precisely the ones you shouldn't unleash ChatGPT on. It needs to be really exact and if it is wrong, you can easily end up with something that is an infinite loop or takes exponential time with respect to the input.
My way of using ChatGPT is to ask it for some complicated sed/awk command; I can then usually tell easily whether the command is correct, or easily look it up. So it is very good for learning.
Many problems seem to have the property that it's easier to verify a solution than to come up with one. If someone provides a filled-out sudoku puzzle, it's relatively straightforward to check whether they've followed the rules and completed it correctly. Actually solving the puzzle from scratch, however, requires a different kind of thinking and might take more time.
I've also found that learning by "ask ChatGPT, paste, verify" is so much faster and more fun than banging my head against concrete to deeply read documentation to reason about something new.
I've started doing this for new programming languages and frameworks as well, and it shortens the learning curve from months down to days.
Agree - by the time I need more than grep and reach for JSON parsing, it’s already complicated enough for a Python script. stdin piped to json.loads ain’t that bad.
I've definitely seen jq thrown into sed/awk scripts where a readable programming language was the right move. People spend hours finding the right syntax for these things, and that time is not always well spent.
I've got similar feelings about it, and recently I started experimenting with writing scripts in Nushell rather than bash + jq. I get the JSON object as a proper type in the script, get reasonable operations available on it, and don't have to think of weird escaping for either the contents or the jq script. It cuts the size of my scripts down by about half, and I'm very happy with the results.
Yeah, Python is like 10-20x the number of lines required to do the same thing as jq (especially with the boilerplate of consuming stdin), but that's also why it's more readable. But generally I agree - I would choose jq over some weird bash/python hybrid most of the time. I just wish it was more immediately readable.
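To make the comparison concrete, here's a minimal sketch of that Python boilerplate (the field names and sample data are invented for illustration), doing roughly what a one-line jq path expression does:

```python
import json

def names(raw: str) -> list:
    """Collect the "name" field from a JSON array of objects,
    the rough equivalent of the jq one-liner: jq '[.[].name]'"""
    return [item["name"] for item in json.loads(raw)]

# As a stdin filter it would be: print(json.dumps(names(sys.stdin.read())))
print(names('[{"name": "a"}, {"name": "b"}]'))
```

More lines than the jq version, but each step is plain Python that any reader can follow.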
Simple jq programs are easy to read because simple jq programs are just path expressions, and the jq language is optimized to make path expressions easy to read. Path expressions like
.[].commit | select(.author == "Tom Hudson")
which basically says "find all commits by Tom Hudson" in the input.
`.[]` iterates over all the values in its input (whether the input is an array or an object). `.commit` gets the value of the "commit" key in the input object. You concatenate path expressions with `|`, and array/object index expressions you can concatenate without `|`, so `.[]` and `.commit` can be `.[] | .commit` and also `.[].commit`. Calls to functions like `select()` whose bodies are path expressions are... also path expressions.
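For comparison, that jq one-liner spelled out in plain Python (the sample data is invented for illustration):

```python
import json

# Sample input shaped like a list of commit wrappers (invented data)
raw = """
[
  {"commit": {"author": "Tom Hudson", "message": "fix parser"}},
  {"commit": {"author": "Someone Else", "message": "add docs"}}
]
"""

# Rough equivalent of: .[].commit | select(.author == "Tom Hudson")
commits = [item["commit"] for item in json.loads(raw)
           if item["commit"]["author"] == "Tom Hudson"]
print(commits)
```

The list comprehension plays the role of the iteration (`.[]`), the `["commit"]` index plays the role of the path step, and the `if` clause plays the role of `select()`.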
Perhaps the most brilliant thing about jq is that you can assign to arbitrarily complex path expressions, so you can write things like `(.[].commit | select(.author == "Tom Hudson")).flagged = true` to update every value the path expression reaches.
The syntax probably seems strange because of this effort to make path expressions so trivial and readable.
jq programs get hard to read mainly when you go beyond path expressions, especially when you start doing reductions. The problem is that it resembles point-free programming in Haskell, which is really not for everyone.
The other thing is that jq is very much a functional programming language, and that takes getting used to.
Also, here’s something that seems not widely appreciated: You can write super clever unreadable one-long-line jq programs embedded in bash scripts (I hear you on the point-free thing), or you can write jq programs that live in their own files, with multiple lines, indentation, comments, and intermediate assignments to variables with readable names. I recommend the latter!
This also won't work, since it'll crash on missing fields. Maybe `e.get("commit", {}).get("author", "")` (ignoring the corner case of a non-list top-level object).
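A quick sketch of that defensive pattern (the sample data is invented):

```python
# Entries may be missing "commit" or "author"; chained .get() with
# defaults avoids the KeyError crashes discussed above.
entries = [
    {"commit": {"author": "Tom Hudson"}},
    {"commit": {}},   # no author
    {},               # no commit at all
]
authors = [e.get("commit", {}).get("author", "") for e in entries]
print(authors)  # → ['Tom Hudson', '', '']
```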
This is a non-problem solved by the jq example. Clearly nobody sane writes (or consumes) APIs which sometimes produce an array of objects and sometimes produce a singular object of the same shape... Or maybe I'm spoiled from using typed languages and can't see the ingenuity of the python/javascript/other-untyped-hyped-lang API authors that it solves?
> Clearly nobody sane writes (or consumes) APIs which sometimes produce an array of objects and sometimes produce a singular object of the same shape...
It has nothing to do with arrays; it has to do with the fact that Python dicts with string indexes and Python objects with attributes are different things, unlike JS, where member and index access are just different ways of accessing object properties.
> Or maybe I'm spoiled from using typed languages and cannot see the ingenuity of the python/javascript/other-untyped-hyped-lang api authors that it solves?
This isn't an untyped thing. JavaScript (and thus JSON) and Python both have type systems (even if they usually aren't statically declared), and those type systems, and thus the syntax around objects, differ between the two.
Oops, yep, totally. Even more futzy! I think if I was doing this a lot I'd pull out one of those "dict wrappers that allow attr-based access" that lots of projects end up writing for whatever reason.
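A minimal sketch of such a wrapper (the name `AttrDict` is mine, not from any particular project):

```python
class AttrDict(dict):
    """Tiny dict subclass allowing attribute-style access: the pattern
    the comment mentions, sketched, not a production implementation."""
    def __getattr__(self, name):
        try:
            value = self[name]
        except KeyError:
            raise AttributeError(name) from None
        # Wrap nested dicts so chained attribute access works too
        return AttrDict(value) if isinstance(value, dict) else value

d = AttrDict({"commit": {"author": "Tom Hudson"}})
print(d.commit.author)
```

`__getattr__` is only called when normal attribute lookup fails, so real dict methods like `.get()` keep working.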
I wish it had won out over jq, because JMESPath is a spec with multiple implementations and a test suite, where jq is... well, jq, and languages have bindings, not independent implementations.
> I wish it had won out over jq, because JMESPath is a spec with multiple implementations and a test suite, where jq is... well, jq, and languages have bindings, not independent implementations.
jq has multiple implementations too! In Go, Rust, Java, and... in jq itself.
> jackson-jq aims to be a compatible jq implementation. However, not every feature is available; some are intentionally omitted because they are not relevant as a Java library; some may be incomplete, have bugs or are yet to be implemented.
Where JMESPath has fully compliant 1st party implementations in Python, Go, Lua, JS, PHP, Ruby, and Rust, and fully compliant 3rd party implementations in C++, Java, .NET, Elixir, and TS.
Having a spec and a test suite means that all valid JMESPath programs will work, and work the same, anywhere you use them. I think jq could get there, but it doesn't seem to be the project's priority.
I've found Ruby much nicer for writing dirty parsing logic like this in a "real" language; it lets you be more terse and "DRY" than Python. In bigger software projects that doesn't hurt me as much, but when I'm primarily trying to write something that would otherwise be well handled by SQL or jq, I find Ruby the better middle ground for me.
"Indecipherable string" to me means you likely don't understand the language or how it works.
The language itself works very well for what it needs to do.
It does not work the same way as something like parsing an object and manipulating it in python.
It is a query language. You are building up a result not manipulating objects.
Definitely unintuitive if you are coming from a programming language.
Once learned it makes a lot more sense and is even preferable depending on your needs.
> it's the unintuitive query syntax and the need to search for every minute step
I love jq as a power tool and have the same challenges. I think the best path would have been for JavaScript to adopt something akin to JsonPath, although I more often reach for jq out of familiarity than use it in kubectl.
I hadn't looked into JsonPath as a standard, and on closer inspection, it looks to be stalled out. Maybe I'll keep piping kubectl get <resource> -ojson | jq '<what I'm looking for>'.
The responses to this comment seem to miss a vital point that the comment is making: languages executed within a different primary language are usually opaque to the tools in use. Those tools are usually aimed purely at the primary language, not any secondary languages used within it. Tools for the secondary language are now much harder to use because they (usually) have to be invoked and used via the primary language.
If I’m working on a Python script which has some jq embedded in it, then these problems probably exist:
- My editor will only syntax colour the Python, and treat jq code as a uniform string with no structure
- My linter will only consider Python problems, not jq problems
- My compiler, which is able to show parsing errors at compile time rather than runtime, will not give me any parsing errors for jq until execution hits it (yes, Python has a compilation step)
- jq error messages that show a line number will give me a relative line number for the jq code, rather than the real line number for where that code lives in the Python file
- My debugger will only let me pause and inspect Python, and treat the jq execution as a black box of I/O
I’m discussing this as a jq problem, but this happens far more commonly with SQL inside any host language. No wonder ORMs are so popular: their value isn’t just about hiding/abstracting SQL, it’s about wrangling SQL as a secondary language inside a different primary one.
Some tools have tackled this secondary-language problem head-on:
- Microsoft’s LINQ for C#
- Webdev-focused IDEs which aim to correctly handle HTML and Javascript inside server-side languages (e.g. PHP)
jq is way too much for what I need. I hacked together a filter in C to reformat JSON and I like it better than every JSON library/utility I have tried. For simple reformatting, jq is slow and brittle by comparison. Also, I can extract JSON from web pages and other mixed input. All the JSON utilities I have tried expect perfectly-formed JSON and nothing else.
I also find VisiData is useful for adhoc exploring of JSON data. You can also use it to explore multiple other formats. I find it really helpful, plus it gives that little burst of adrenaline from its responsive TUI, similar to fx and jless mentioned.
For my toolbox I include jq, gron, miller, VisiData, in addition to classics like sed, awk, and perl.
I understand where you're coming from and often feel the same, but I'm also afraid that this is a clear case of inherent complexity: querying JSON is just a complex problem and requires a complex query language, regardless of how well a piece of software implementing it is designed. The same is valid for regexes of course.
The main problem is treating one-thing and many-things the same way. It's not a great PL design choice (and it's why we can't have slurp as a filter). If streams (not arrays) were also first-class, we would easily have `smap`, `sselect`, etc., and the code would look like a functional programming language where | is the pipeline operator.
Otherwise, it's fine if you keep the thought "everything is a 'filter' or a composition of filters, and a 'filter' is a function that either maps, flatMaps, or filters things" in your mind at all times.
`jq` and `GNU Parallel` share a world in my brain where I know they're wonderful tools, but since I need each one so rarely, I spend more time grokking their syntax than I would just writing a bash/sed/awk/perl, ruby, or python script to do what I need.
`jq` solves the problem of JSON in legacy shells. But I think the real problem is that the world is stuck using Bash rather than a more modern shell that can parse JSON (as well as other data structures) as natively as raw byte streams.
The problem with Bash is to do anything remotely sophisticated you end up embedding DSLs (a bit of awk, some sed, a sprinkle of jq, and so on and so forth) into something that is itself already a DSL (ie Bash).
Whereas a few more modern shells have awk, sed and jq capabilities baked into the shell language itself. So you don’t need to mentally jump hoops every time you need to parse a different type of structured data.
It’s a bit like how you wouldn’t run an embedded JavaScript or Perl engine inside your C#, Java or Go code base just to parse a JSON file. Instead you’d use your language’s native JSON parsing tools and control structures to query that JSON file.
Likewise, the only reason jq exists is because Bash is useless at parsing anything beyond lists of bytes. If Bash supported JSON natively, like PowerShell does (and to be clear, I’m not a fan of PowerShell, but for wholly different reasons), then there would be literally no need for jq.
The community refuses to admit that PowerShell is a much better alternative to the bash/python combo, and here we are, stuck in this mess. CI/CD script spaghetti is usually the most unstable code in a company.
> The community refuses to admit that PowerShell is a much better alternative to the bash/python combo
Because it's not.
PowerShell is very nice as a glue language for .NET components, and it's better as a general-purpose shell/scripting language than the old DOS-inspired Windows Command Prompt, for sure.
I greatly dislike case-insensitivity. It's a source of many problems for users and implementors.
For implementors, case-insensitivity makes the need for full Unicode support urgent, whereas Unicode canonical equivalence rarely does. In practice one often sees case-insensitivity for ASCII only, and when full Unicode support is added later you either have a backwards-compatibility break or need new functions/operators/whatever to support Unicode case insensitivity.
For users case-insensitivity can be surprising.
For code reviewers, having to constantly be on the lookout for accidental symbol aliasing via case insensitivity is a real pain.
Why does it have to be bash+python? I'm finding myself using node.js scripts glued together by bash ones these days unless I'm working on a lot of data. Doing that means you can work with json natively.
`json.loads` in Python exists, and Python does the intuitive thing when you do `{"a": 1} == {"a": 1}`, at least for most purposes (you want the other option? `is` is right there!). Stuff like argparse is not the easiest thing to use, but it's in the standard library and works well enough.
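A tiny sanity check of what "the intuitive thing" means here:

```python
import json

a = json.loads('{"a": 1}')
b = {"a": 1}

print(a == b)  # value equality on parsed JSON: True
print(a is b)  # identity is a separate question: False
```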
Not going to outright say that node.js scripts are the worst thing ever (they're not), but out-of-the-box Python is totally underrated (except on macOS, where `urllib` fails with some opaque errors until you run some random script to deal with certs).
Assuming <data> will be a key-value object (i.e. a dict), it would be something like this:

    import json

    data = json.loads('<data>')
    bar = None
    if foo := data.get('foo'):
        bar = foo[0].get('bar')
    print(bar)
If you can't be sure you'll get a dict, another type check would be necessary. If you read from a file or file-like object (like sys.stdin), json.load should be used instead.
I love nodejs, it's my go-to language for server side stuff.
Even with that bias though, I have to admit that it's awful for typical command line script stuff.
Dealing with async and streams and stuff for parsing csv files is miserable (I just wrote some stuff to parse and process hundreds of gigs of files in node, and it wasn't fun).
Python is the right tool for that job IMHO.
Also, weirdly, maybe golang? I just came across this [1] and it has one of my eyebrows cocked.
Any language not designed specifically for shell use will suck for shell, more or less. Ruby, python, node, whatever: they all have the same problem. You have to write too much and care about things you shouldn't have to care about while in a shell.
You're probably right. I just wish there was an easier way to handle JSON on the command line that didn't turn into its own DSL. The golang scripting seems interesting; it might be what motivates me to learn the language.
Apparently, the old community needs to literally die off with its old habits for the new to take its place. No amount of good argumentation can be fruitful here, and there is a ton of it; pwsh is simply on another level than the existing combos.
The fact that you have to learn a new language to parse JSON is frankly insulting. If you've gotten to the point where you're parsing JSON with a shell script, you should've switched to a real language a week ago.
Some people are weird and marvel at the elegance of piping 8 obscure commands, but if I'm given this shit and have to keep it working, I'm rewriting it on the spot.
Are you rewriting it in the first language you learned?
Sometimes less general tools are nice. If they fit the problem space well, they can be very expressive without feeling unwieldy. And in some contexts reducing the power/expressivity is actually a good thing (e.g. not using a C interpreter to make your program and your config file use the same 'language')
I also just added a jq parser/grammar to the online LALR(1)/FLEX grammar editor/tester at https://mingodad.github.io/parsertl-playground/playground/: select "Jq parser (partially working)" from the examples, then click "Parse" to see a parse tree of the source in "Input source".
A related question for you and anyone else into this kind of tooling: if you had to automate some structural edits across a codebase that contains a wide range of popular languages (say: C++, C#, Java, Ruby, Python), and you had to do it with a single tool, which tool would you use?
jq is great for letting users munge their data; we do something similar letting users provision an incoming webhook endpoint, send us arbitrary json data, and set up mappings to do useful things with their data, along with regression tests, monitoring, etc. jq makes the majority of cases straightforward (basically json dot notation) and the long tail possible.
I run a bootstrapped software company, not at the $100M scale.
My interest piqued, I clicked, had a look at Aha! (having never heard of it before) and immediately thought: hey, I could use this. You got me, ka pai (well done), haha.
Any insight on how to navigate inflection points that seemingly require more capital than cashflow allows? Stay true and spend less? Number 8 wire might not be the same solution today as it once was; that's my worry.
In a business that is more capital-intensive than SaaS, bootstrapping is likely to be a lot harder. The beauty of SaaS is that expenses scale smoothly with revenue, so you never need large capital infusions.
It may be tempting to think "if only I had enough money for a big advertising campaign". But if you are not seeing that small investments in advertising result in corresponding small increases in revenue, then it is very unlikely that a big investment in advertising will result in a big return.
When something is working well it is blindingly obvious. If you are having to look hard for signs of positivity then it is not working. Lots of money can only serve to make things appear successful for a limited period of time.
Hi @richieartoul, does WarpStream support the standard Kafka Admin API?
We build an admin console / dev tooling for Kafka (https://kpow.io) that supports Kafka 1.0+ including Redpanda, due to their fairly strict adherence to those APIs.
Warpstream seems like a cool idea, I'd like to see what happens if we plug Kpow on top of it, if that's possible.
That said, we’d be stoked to get this working as an additional tool for people. Do you want to shoot me an email or join our Slack so we can discuss further? I can probably prioritize whatever protocol features are missing to get it working.