Hi everyone! I'm glad to see so much interest in this image in this amazing community. I'm the CTO of Observatorio El Sauce, the observatory where this telescope is located. This place is a fully robotic observatory that provides a service called "telescope hosting", which basically means that people send us their telescopes and observe remotely from wherever they live.
Why would they do that instead of observing from their backyards?
That's because in that part of Chile we have the best sky quality for astronomy in the world, in terms of the number of clear nights a year (320 clear nights a year), light pollution (class 1 on the Bortle scale), and in something called "seeing" (average below 1"), which is a measure of the smallest detail the sky allows you to resolve (the smaller the better). Thus, from our observatory our clients get the best possible image they can get with their telescopes.
Also, a good friend of mine developed this digital scope so you can zoom in on the picture easily without going back to the '90s internet experience:
https://scope.avocco.com/case/20/eWKcUiIXpuQU9V0z
I'd be happy to answer your questions :) Enjoy!
Cool service! Can you say anything on how many telescopes you host, and ballpark price, and any history of how the observatory got started (land, permits etc)?
Hi! Thanks :) We already host 30 telescopes for different purposes, mainly science and astrophotography. Our standard plans, which include maintenance and most of the support you'd need, cost around 7500 USD/year.
The project started in 2012, we spent a couple of years finding the land, doing all the necessary paperwork and getting permits, getting internet, etc. Our first telescope arrived in the beginning of 2015.
Our goal is to be a professional alternative to the big scientific observatories, but mainly purposed for small and middle-sized telescopes.
Magnificent! Congrats to Ciel Austral. We live in a New Golden Age in astrophysics and astronomy. With M87 black hole, TESS exoplanet catalog, Charon flyby, LIGO-Virgo, etc. ;)
“In an eternally inflating universe, anything that can happen will happen; in fact, it will happen an infinite number of times. Thus, the question of what is possible becomes trivial—anything is possible […] The fraction of universes with any particular property is therefore equal to infinity divided by infinity—a meaningless ratio.” -Alan Guth
If you actually lived there it would probably just seem like a lot of uninteresting empty space. It only looks interesting because what you're looking at is far beyond huge, all squashed together into one picture.
“Space is big. You just won't believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space.”
Well, I think you could live a whole life on this planet without seeing the same place twice. If you think you have seen it all, you probably didn't look close enough.
I mean, you can see every place on this planet within a few seconds if you are taking a look at some satellite photo and yet you would not have seen your house and the living beings around it. Similarly, there are many more details which we do not perceive in our everyday life and which you can explore without have to invent a couple of scientific miracles ;-)
It is not that I don't value the stars. It is just that I think we shouldn't dream of arriving at some wonderful planet one day when we already are on a wonderful planet. We just don't appreciate it like that.
I think OP means that the LMC isn't a place per se (like a planet, or maybe even a solar system, is a "place"). It's a whole bunch of places bound together by gravity.
A solar system is a whole bunch of places bound together by gravity, too. What's the fundamental distinction here? That the center of mass isn't approximately one massive thing (e.g. the solar system is 99% the center -- the sun)?
In a very shallow, simplistic sense, we see all these other amazing phenomena out there. Our home seems like a little rural backwater, crossroads and a store kind of thing, by comparison.
I opened the full image (14400x14200), which took a good minute to load, and spent some time just looking at every single dot in that picture, of which there are a lot.
Memories of Netscape, on dial up, waiting for the picture to load, but now the detail and resolution is so incredible. It's a great way to start a Monday!
I opened it in Chrome 73 & Preview on MacOS to see memory consumption on Activity Monitor.
When the full image was loaded up in Chrome, Chrome Helper showed up with ~1.30GB of memory consumption; it sometimes went up to 5.60 GB on repeated tests and quickly reverted to ~ 1.30 GB on average.
Preview consumed ~550MB at the beginning, and further zooming in/out consumed ~250MB on average.
Chrome struggled a bit after clicking Zoom. Preview showed a low-resolution image when Zoom was clicked and then rendered the original resolution. Preview's user experience was comparatively better.
Intel Power Gadget showed CPU spikes during these actions, but nominal GPU spikes; I don't think Metal is being used in Preview.
" Indeed, astrophotographers used a couple of special filters which transmit narrow parts -lines- of the visible spectrum : the Hydrogen Alpha line at 656 nm, the Sulfur line at 672 nm and the Oxygen III spectral line at 500 nm"
Can anyone comment on how much the images produced by these filters differ from what the human eye would see if it were somehow able to look at these objects? Are they also taking in information from the non-visible spectrum and coloring it, or is this all just a focusing of light that humans would have been able to perceive?
I know they mentioned using different filters to achieve the two different images but was
There are several parts to the answer of your question.
First of all, emission nebulas are not very bright, so no telescope can give a picture as bright as a long exposure does. If you can see a nebula through a telescope at all, it will be very faint. Which triggers another effect in your eye: the cells for color reception (the cones) are not very sensitive. As with general night vision, you will usually see nebulas only with your light-sensitive receptors (the rods), which don't see colors. So a nebula will appear in a grey-greenish color.
High-quality pictures of nebulas are taken at very specific wavelengths, at the common emission lines you listed. Even at high brightness, they wouldn't directly convert into a good color picture, as 500 nm is turquoise while 656 and 672 nm are both very deep red. A color image converting these wavelengths directly into RGB values would not be very impressive; it would look more like the bottom image on the page. So usually a color mapping is used to generate impressive images which also show a lot of the detail information. With 3 different "colors" in the source image, you can apply an arbitrary transformation to generate an RGB image. For example, most images from the Hubble telescope use a common mapping which is consequently called the Hubble palette.
Like shown on the page, you can create very different looking images from the same data set by choosing the color mapping.
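To make the idea concrete, here is a minimal sketch (in Python/NumPy, not from the thread) of how three aligned narrowband frames can be mapped into RGB channels in the usual Hubble-palette assignment (SII → red, Ha → green, OIII → blue). The toy 2x2 arrays are stand-ins for real exposures; the per-channel normalization is just one simple choice among many.

```python
import numpy as np

def map_sho_to_rgb(sii, ha, oiii):
    """Map three narrowband frames (SII 672 nm, Ha 656 nm, OIII 500 nm)
    to an RGB image using the common 'Hubble palette' assignment:
    SII -> red, Ha -> green, OIII -> blue."""
    rgb = np.stack([sii, ha, oiii], axis=-1).astype(np.float64)
    # Normalize each channel independently so faint channels stay visible.
    for c in range(3):
        ch = rgb[..., c]
        lo, hi = ch.min(), ch.max()
        rgb[..., c] = (ch - lo) / (hi - lo) if hi > lo else 0.0
    return rgb

# Toy 2x2 frames standing in for real narrowband exposures.
sii  = np.array([[0.0, 1.0], [2.0, 3.0]])
ha   = np.array([[3.0, 2.0], [1.0, 0.0]])
oiii = np.array([[0.0, 0.0], [0.0, 4.0]])
img = map_sho_to_rgb(sii, ha, oiii)
```

Swapping the channel order in `np.stack` is all it takes to produce the "very different looking images from the same data set" the comment above describes.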
Under excellent conditions the visual experience is very similar to pictures. Emission nebulas only emit light in very narrow bands, and a filter will suppress the stars and give the nebula more contrast.
Short answer: the color they mapped to green is actually closer to red and the color they mapped to blue is closer to green, so the scene would have less cyan (blue and green) and more magenta (blue and red). It would probably look a little more purple-ish.
It blows my mind to zoom in on just one part of the picture and see how many stars there are. And then multiply that by the entire picture. And then multiply that by all the galaxies in the universe. My mind just isn't built to comprehend numbers that large.
Actually, most of those dots are galaxies. The dots that make up the large structures are stars in the LMC which is a collection of star clusters, but everything else that is not part of some larger structure must be a galaxy.
Yeah, if the universe doesn't make one humble, then probably nothing will. After the recent imaging of M87 and its central black hole, I ended up on a page which lists the biggest black holes known to us [1]; it's pretty humbling to read the details on some (e.g., quasars outshining their whole host galaxy so we see only them). The universe is surprisingly diverse.
Maybe the designer believes the feature of scrolling to the banner image at the top is absolutely essential and should be done frequently! That's why they made it red and blink, it's very important!
This is DEM L316. It might seem like one object, but these are the remnants of two different supernovas (of different types: smaller is Type Ia, bigger Type II).
Does anybody know how expensive the setup described is?
- Remotely-controlled observatory at the El Sauce Observatory in Chile
- A 160-mm APO-refractor telescope and a Moravian CCD
- Presumably hefty image processing requirements
- etc.
I get that these guys are amateurs in that they are not being paid for this but presumably this costs some serious money? Or are the components they use in reach of a well to do hobbyist these days (all relative I know)?
That's just the big-ticket stuff. They'll have a guidescope, colour filters, laptop (I assume), all sorts of paraphernalia supporting the effort.
Given that it's remote-controlled and in an observatory in Chile, I suspect that adds another order of magnitude to the cost. But I'm unsure specifically how much, or if they're renting scope time.
You can buy much cheaper equipment and still do admirably, this set up is really quite extreme for a hobbyist.
Can someone give a quick explanation of the objects in the picture? Are all the nebulas in the LMC or in the foreground? Is the LMC the reddish haze in the background?
The nebulas are regions where stars are forming in the LMC. In fact, the brightest and largest, on the middle left, is the Tarantula Nebula, with the young forming star cluster 30 Doradus. 30 Dor is notable for being the biggest, baddest star-forming region in the Milky Way or its satellites: it hosts a few hundred stars more massive than sixty times the mass of the sun in its core, and hosts the candidates for the highest-mass stars observed, above around 150 solar masses. If placed 100 times closer, where the Orion Nebula is, its illumination would cast visible shadows, taking up a quarter of the night sky with an average surface brightness that of Venus.
Your mass, and mine, are parties to the gravitational conspiracy, yes. One can even trivially prove that the Freemasons and the Roman Catholic Church are in on it. "We" is an appropriate pronoun in this instance, just as it would be for a nation of which you are a citizen.
The LMC is 160k light years away; 100 years ago 'we' didn't exist. The constituent matter that forms us was there, but no 'we'. Perhaps the twinkle in the eye, but no 'we' yet.
Since information and gravitational influence are limited by the speed of light, the reach of 'we' is bounded by our age, so to about 100 light years for all but a very few people.
In this case, the "we" was a stand-in pronoun for the Milky Way galaxy, of which we are part. It would have been linguistically tortuous to have phrased it in any other way; "we" was the appropriate pronoun, no matter how much it may embarrass you to be implicated in the action. Natural languages are not instances or implementations of propositional calculus.
Well, that's what I'm doing tomorrow, but it will be for my living room. It's a 135x242cm piece of wall, which is not as square a ratio as the image, so I had to crop it from its upper left corner down. Inkscape exported it to a 666MB PNG (ugh), I don't know why. So let's see...
It's one section only, which is 15945x28583px. I'll have it printed at a shop near home which specializes in large prints for cars and trucks. Their prints are like peel-off stickers.
What kinds of image processing are used for this? Do the images need to be aligned, or can telescopes be pointed precisely enough? Are the images combined using mean/median, or something more sophisticated than that? What settings are the original photos captured with?
There are several software options on image processing for astrophotography. Do a search for 'astrophotography image stacking', and you'll get a list of software, tutorials, videos, etc. A couple of the popular ones are Deep Sky Stacker[0] or PixInsight[1] or even Photoshop. They offer different options/capabilities.
The main thing about the capture settings is to use RAW. Other settings (ISO, exposure time, etc.) depend on the camera being used. However, the goal is whatever you can do to capture as much light as possible within each frame.
The software does the image alignment (rotate/scale/etc.) needed for stacking. You can stack images taken of the same object from different physical locations. Spend a weekend in the desert shooting an object, then spend another weekend the next month at the top of a mountain shooting the same object, and all of the images can be stacked.
The telescope alignment precision is important, but less so than it used to be for a couple of reasons. With gear available today, you can take "portable" telescopes into the field, do a decent polar alignment and then allow the guide scope/software to correct for any imprecision of the main scope's alignment and even tracking issues from manufacturing issues with the mount's worm gear. A guide scope is a second smaller telescope (wider field of view) attached to the main scope with a camera attached to it. That camera is connected to a computer running the guide software, and will track a designated star. The guide software will talk to the telescope's motors, and can speed up/slow down the motors to keep the guide star to within a 1/4 pixel deviation.
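The guiding loop described above can be sketched in a few lines. This is a simplified, hypothetical one-axis version (not how any particular guiding package such as PHD2 actually implements it): measure the guide star's drift from its target position, and if it exceeds a deadband (the "1/4 pixel" tolerance mentioned above), command a proportional correction back toward the target.

```python
def guide_correction(star_x, target_x, gain=0.7, deadband=0.25):
    """One axis of a toy autoguiding loop: if the guide star has drifted
    more than `deadband` pixels from its target position, command a
    proportional correction (here in pixels; a real guider converts
    this to mount motor pulses). A simplified sketch, not a real
    guiding algorithm."""
    error = star_x - target_x
    if abs(error) <= deadband:
        return 0.0          # within tolerance: no correction needed
    return -gain * error    # push the star back toward the target

# A 0.1 px drift is inside the deadband; a 1.0 px drift triggers a pulse.
small = guide_correction(10.1, 10.0)
large = guide_correction(11.0, 10.0)
```

The gain below 1.0 is the usual trick to avoid over-correcting and oscillating when the seeing itself jitters the star.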
Also, with digital cameras, images of shorter exposure times are taken and then stacked in software. There are multiple benefits to doing this. Consider exposing a single frame for 60 minutes, versus twelve 5-minute exposures, or thirty 2-minute exposures. If anything bad happens during one exposure (a plane or a satellite crosses your view, someone uses a laser pointer through your frame of view, a bug lands on your primary, etc.) it's not "that big of a deal" to capture it again. Also, digital camera sensors tend to get noisy with longer exposures due to heat build-up around the sensor (a problem film cameras do not suffer).
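The "stack many short exposures and reject the bad bits" idea can be sketched with a sigma-clipped mean (one common rejection scheme among several; the choice of 3 sigma here is just a typical default, not what any particular software in this thread uses):

```python
import numpy as np

def sigma_clip_stack(frames, sigma=3.0):
    """Combine many aligned short exposures into one image, rejecting
    outlier pixels (satellite trails, cosmic rays, planes) before
    averaging. `frames` is a sequence of aligned 2-D exposures."""
    stack = np.asarray(frames, dtype=np.float64)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    # Keep only pixels within `sigma` standard deviations of the mean
    # (small epsilon so identical frames with std == 0 all survive).
    mask = np.abs(stack - mean) <= sigma * std + 1e-12
    # Average only the surviving pixels at each position.
    return np.where(mask, stack, 0.0).sum(axis=0) / mask.sum(axis=0)

# Ten identical 'sky' frames plus one ruined by a bright satellite trail.
frames = [np.full((4, 4), 100.0) for _ in range(10)]
trail = np.full((4, 4), 100.0)
trail[2, :] = 5000.0     # the trail crosses row 2
frames.append(trail)
result = sigma_clip_stack(frames)
```

The trail pixels get rejected as outliers, so the stacked row comes back to the sky value instead of being dragged up by the satellite, which is exactly why losing one short sub-exposure is "not that big of a deal".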
> Also, digital camera sensors tend to get noisy with longer exposures due to heat build up around the sensor (a problem film cameras do not suffer).
Maybe worth pointing out that film has its own issues with long exposures, though. If I remember right, film's response to light isn't strictly linear with exposure time so you get less and less useful additional exposure as you expose longer.
Yes, with film there is the Schwarzschild effect, which causes the sensitivity and the color response to vary with the exposure time.
While digital camera sensors usually pick up noise in long exposures, this is less of an issue for astronomical cameras, because they fight this noise by cooling the CCD chip. Usually the chip is cooled via a Peltier element to temperatures below -20C, where thermal noise is very low.
According to the official website [1] (which is linked to from TFA), the images were processed using PixInsight, a program popular among amateur astronomers. This page: [2] explains how PixInsight performs image alignment; it turns out to be a pretty complex (and interesting) process. The same page also explains the process of merging images.
It is not possible to ever see a scene like this even if one were sitting in deep space, is it? These sorts of images are the result of long exposures, but a human would only see blackness and stars, and maybe some faint puffs of light here and there.
You wouldn't ever see the colors. They are far too dim without magnification. If you were standing in the cloud you would probably see it a little, like we see our galaxy as a blurry cloud, but only on the darkest nights.
There's some astrophotography that fills in bands of the wavelength we can't see with colours to give us the perception of being able to see gas clouds etc. I'm not sure if this is done here but it's probably worth mentioning that not all space images you see are realistic in terms of human visible wavelengths.
Then we should also mention that human color vision changes depending on light levels, with us being more sensitive to some colors than others. So when you over-expose an image you aren't just making it brighter but changing the ratios of perceived colors. (A big deal for eye witness reports of crime at night.) At very low levels our vision becomes essentially black and white.
With these hi-res images I'm always curious which of the stars actually belong to the galaxy and which ones are "noise", i.e., stars that are from our own galaxy "blurring" the view.
I think filtering out "local" stars should be very doable given ML/CV progress.
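Even without ML, a crude first pass is possible when parallax measurements exist for the stars in the field (e.g. from an astrometric catalog): anything with a measurable parallax placing it well inside the Milky Way is foreground. The sketch below assumes a hypothetical list of `(name, parallax_mas)` tuples, not a real catalog API, and a deliberately generous distance cut.

```python
def flag_foreground(stars, max_distance_ly=50000.0):
    """Split a star list into foreground (Milky Way) candidates and
    everything else by parallax distance. `stars` is a list of
    (name, parallax_mas) tuples; distance in parsecs is 1000/parallax
    for parallax in milliarcseconds. Hypothetical data format."""
    PC_TO_LY = 3.26156
    foreground, background = [], []
    for name, parallax_mas in stars:
        if parallax_mas <= 0:       # unmeasurable: too distant or noisy
            background.append(name)
            continue
        distance_ly = (1000.0 / parallax_mas) * PC_TO_LY
        (foreground if distance_ly < max_distance_ly
         else background).append(name)
    return foreground, background

# Toy entries: a nearby star (large parallax) vs. LMC-distance objects.
stars = [("near_star", 10.0), ("lmc_candidate", 0.02), ("no_parallax", -0.1)]
fg, bg = flag_foreground(stars)
```

At the LMC's ~160k light-year distance the true parallax is far below current measurement precision, which is why the unmeasurable cases default to "background" here; real foreground/member separation also uses proper motions and colors.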
You won’t see anything like that with the naked eye, period. We can’t build up a composite of all photons our eye gathers over 1060 hours! There are places where you can get minimal light pollution and see amazing things though.
No, the best you can do unaided is just a general view of the Milky Way.
But if you're willing to accept some optical aids like a reflector and eye piece, a large amateur "light bucket" dobsonian telescope can unveil deep space objects to the naked eye.
I don't think it's possible to get anything like these photos though, the sensor is collecting light over a very long duration to present as a single image. The only way to get more light into your naked eye real-time is with more aperture, obviously there are practical limits there.
I've seen the Magellanic Clouds with my naked eye. I worked at an observatory in rural Argentina (a location chosen for its low light pollution). One night, I went out to one of the telescopes for emergency maintenance. When we got back out into the dark and hadn't turned on the headlamps of the car yet, the Milky Way stretched like a band across the sky, and you could see both Magellanic Clouds as small but macroscopic objects, indeed looking like clouds.
This was among the most breathtaking things I've ever seen (the other being a particularly vivid showing of northern lights in Alaska). The southern hemisphere's sky is infinitely more exciting than the northern one.
Also: Does anyone here know what star that is? Is it one of the brightest stars in the night sky, or is it just super bright relative to everything else in this picture?