I'll discuss this a bit during my talk at the dev summit.
The short answer is no.
The long answer is yes, but only if you create the model in Python, export it, and then feed training data in other languages. There are some people doing exactly that.
Long term, I'd like to give all languages equal footing, but there's quite a bit of work left.
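For anyone curious what the "create in Python, feed data from elsewhere" flow looks like, here's a rough sketch of the Python side only (TF 1.x-style API; the op names, shapes, and export path are just illustrative, not the only way to do it). You name the ops you care about, export the graph, and a client in another language loads it, runs the init op once, and then drives the train op while feeding its own batches.

```python
# Minimal sketch (TF 1.x-era API): build the model and training op in Python,
# give the ops stable names, and export the graph definition so a C++/Java/Go
# program can load it and feed training data.
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784], name="x")   # input batch
y = tf.placeholder(tf.float32, shape=[None, 10], name="y")    # labels
w = tf.Variable(tf.zeros([784, 10]), name="w")
b = tf.Variable(tf.zeros([10]), name="b")
logits = tf.matmul(x, w) + b
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss, name="train")

# The grouped initializer op is named "init" by default; the other-language
# client runs it once, then runs "train" repeatedly, feeding "x:0" and "y:0".
init = tf.global_variables_initializer()

tf.train.write_graph(tf.get_default_graph().as_graph_def(),
                     "/tmp/tf_export", "graph.pbtxt")
```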
Forgive my ignorance, but why is it Python-only?
Does Python have intrinsic qualities that other languages don't possess, or is it that the huge initial investment in creating TensorFlow was based on Python and duplicating that effort in another language would require too much work?
Traditionally, most neural network architectures have been implemented in C/C++ for performance reasons. But ML researchers are not hackers, for the most part, and Python has the lowest impedance mismatch for interfacing with C/C++ of all the major languages. Julia was popular for a bit, but now Python is dominant. Programs tend to be very small and not modular, so static type checking is less important than it would be for catching errors in larger systems.
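To make the "low impedance mismatch" point concrete, calling into a C library from Python is a few lines with ctypes and no build step on the Python side (the library name "libfastkernel.so" and the "dot" function here are made up for the example):

```python
# Rough illustration only: wrap a hypothetical C function
# double dot(double* a, double* b, int n) from a shared library.
import ctypes

lib = ctypes.CDLL("./libfastkernel.so")          # made-up library name
lib.dot.restype = ctypes.c_double
lib.dot.argtypes = [ctypes.POINTER(ctypes.c_double),
                    ctypes.POINTER(ctypes.c_double),
                    ctypes.c_int]

a = (ctypes.c_double * 3)(1.0, 2.0, 3.0)
b = (ctypes.c_double * 3)(4.0, 5.0, 6.0)
print(lib.dot(a, b, 3))   # the heavy lifting stays in C
```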
It's not just the lowest impedance mismatch: it's also a framework coming out of Google, where Python and Java were really the only two language choices for a high-level interface, and of the two Python is the clear winner in prototyping and scientific-community acceptance. I think it's because of the ease of experimentation and the expressiveness of the language.
TensorFlow comes with an easy-to-use Python interface and no-nonsense interfaces in other languages to build and execute computational graphs. Write stand-alone TensorFlow Python, C++, Java, or Go programs, or try things out in an interactive TensorFlow IPython notebook where you can keep notes, code, and visualizations logically grouped. This is just the start, though: we're hoping to entice you to contribute interfaces to your favorite language, be it Lua, JavaScript, or R.
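For what "build and execute computational graphs" means in practice, here's a tiny sketch with the Python interface (TF 1.x-style Session API), the kind of thing you'd poke at in a notebook:

```python
# Graph construction happens first; nothing is computed until Session.run.
import tensorflow as tf

a = tf.placeholder(tf.float32, name="a")
b = tf.placeholder(tf.float32, name="b")
c = a * b   # adds a node to the graph, does not compute anything yet

with tf.Session() as sess:
    print(sess.run(c, feed_dict={a: 3.0, b: 7.0}))   # -> 21.0
```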
Yes, though the last I remember reading about this, symbolic differentiation only worked in Python, so training from other languages wasn't quite there. I think the language on the page was always similar to the above.
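That's the crux of it: the gradient ops are constructed by the Python front end (tf.gradients), not by the runtime, so a non-Python client only sees whatever gradient/training ops Python already put into the exported graph. A small sketch of what happens on the Python side (TF 1.x-style API):

```python
# tf.gradients builds the gradient subgraph here, in Python.
import tensorflow as tf

x = tf.placeholder(tf.float32, name="x")
y = x * x + 3.0 * x
dy_dx, = tf.gradients(y, [x])   # gradient ops added to the graph at this point

with tf.Session() as sess:
    print(sess.run(dy_dx, feed_dict={x: 2.0}))   # d/dx(x^2 + 3x) at x=2 -> 7.0
```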
Well, there's Gorgonia[0] (shameless promo: I wrote it). It's like TF/Theano. I'm finishing up porting/upgrading the CUDA-related code from the older version (long story short: I needed a dependency parser, so I hacked on the CUDA stuff, and now I'm paying the price for not properly engineering it).