Hello! We are Dr. Roman Berens, Prof. Alex Lupsasca, and Trevor Gravely (PhD Candidate) and we are physicists working at Vanderbilt University. We are excited to share Black Hole Vision: https://apps.apple.com/us/app/black-hole-vision/id6737292448.
Black Hole Vision simulates the gravitational lensing effects of a black hole and applies these effects to the video feeds from an iPhone's cameras. The application implements the lensing equations derived from general relativity (see https://arxiv.org/abs/1910.12881 if you are interested in the details) to create a physically accurate effect.
The app can either put a black hole in front of the main camera to show your environment as lensed by a black hole, or it can be used in "selfie" mode with the black hole in front of the front-facing camera to show you a lensed version of yourself.
There are several additional options you can select when using the app. The first lensing option you can select is "Static black hole". In this mode, we simulate a non-rotating (Schwarzschild) black hole. There are two submodes that change the simulated field-of-view (FOV): "Realistic FOV" and "Full FOV". The realistic FOV mode takes into account the finite FOV of the iPhone cameras, leading to a multi-lobed dark patch in the center of the screen. This patch includes both the "black hole shadow" (light rays that end up falling into the black hole) and "blind spots" (directions that lie outside the FOV of both the front-and-rear-facing cameras). The full FOV mode acts as if the cameras have an infinite FOV such that they cover all angles. The result is a single, circular black hole shadow at the center of the screen.
Next, you can select the "Kerr black hole" mode, which adds rotation (spin) to the black hole. Additionally, you can augment the rotational speed of the black hole (its spin, labeled "a" and given as a percentage of the maximal spin).
In a nutshell, the app computes a map from texture coordinate to texture coordinate. This map is itself stored as a texture --- to obtain the value of the map on texture coordinates (x,y), one samples the texture at (x,y) and the resulting float4 contains the outputs (x',y') as well as a status code.
When the user selects the "Static black hole" mode, this texture is computed on the GPU and cached. The "Kerr black hole" textures, however, have been precomputed in Mathematica, due to the need for double precision floating point math, which is not natively available in Apple's Metal shading language.
The source code, including the Mathematica notebook, can be found here https://github.com/graveltr/BlackHoleVision.
We hope you enjoy watching the world with Black Hole Vision and welcome any questions or feedback. If you like the app, please share it with your friends!
The code was written at Vanderbilt University by Trevor Gravely with input from Dr. Roman Berens and Prof. Alex Lupsasca. This project was supported by CAREER award PHY-2340457 and grant AST-2307888 from the National Science Foundation.
License: This app includes a port of the GNU Scientific Library's (GSL) implementation of Jacobi elliptic functions and the elliptic integrals to Metal. It is licensed under the GNU General Public License v3.0 (GPL-3.0). You can view the full license and obtain a copy of the source code at: https://github.com/graveltr/BlackHoleVision.
By any chance, was Andrew Strominger involved in this at all? He gave the Andrew Chamblin Memorial Lecture in Cambridge last month and demoed something that looked similar.
I think what he showed you was likely a version of this that was coded up by Harvard graduate student Dominic Chang: https://dominic-chang.com/bhi-filter/
It works very well (and in a browser!) but is limited to a non-rotating (Schwarzschild) black hole---we really wanted to include black hole spin (the Kerr case). As we write on the github, talking with Dominic about his implementation was very useful and we are hoping to get a paper explaining both codes out before the end of the year.
Yes, Andy has been very involved in the story of the photon ring and was one of the lead authors on the original paper that started it all: https://www.science.org/doi/10.1126/sciadv.aaz1310
(And he was also my PhD advisor.)
I feel like this app could also be an app clip to make it so that you don’t have to outright install the app to use it: https://developer.apple.com/app-clips/
I’m confused by what I see.
It looks like nothing actually disappears. I expected a black hole to not just affect what an area looked like, but also to “disappear” some part of what was there.
I think that’s why this demonstration is interesting. It’s showing how the light can be bent around the black hole. Anything that crosses the event horizon won’t be coming back, but because of the lensing of the light you can “see” behind a black hole.
So if I’m understanding correctly, the black hole is supposed to be between me and what I’m looking at, not in what I’m looking at?
If so, then my question is wouldn’t some light be lost to the black hole? Shouldn’t a substantial portion of the light coming at me from the other side of the black hole disappear into the black hole, making what does lens around dimmer?
Yes, some light would be lost to the black hole, but some light you would not normally have seen is now coming your way due to spacetime warping.
When you say "normally," do you mean all else being equal except no black hole? Or substitute an opaque mass for the black hole?
Here's a Veritasium video on the gravitational lensing effect: https://www.youtube.com/watch?v=zUyH3XhpLTo
A lot of light would be absorbed by the black hole. A lot of light paths would be bent and miss or nearly miss the black hole, making the edges of the black hole quite bright. The dimmed region would be much larger than the (brighter) immediate periphery.
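To put a number on the capture (standard result for the non-rotating case): light rays with impact parameter b below the critical value fall in, and that critical value is noticeably larger than the horizon itself:

    b_crit = 3*sqrt(3) * GM/c^2 ≈ 2.6 r_s

so the dark "shadow" appears bigger than the event horizon, ringed by the bright, strongly bent paths.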
Because, for an external observer, time infinitely slows down near the event horizon. In other words, during one hour by the clock of the far-away observer, the time that passes by the clock of the falling observer approaches zero as he approaches the event horizon. So, when you look from the outside, objects get 'frozen' as they approach the event horizon. For the falling observer, nothing special happens at the event horizon, and he just falls through.
If you happen to approach the event horizon closely and come back again far away to where you started, you will see that a lot of time passed at your origin, while by your clock, the trip might have been short.
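In formulas: a clock hovering at radius r outside a Schwarzschild black hole ticks slower than a far-away clock by the standard factor

    dτ = sqrt(1 - r_s/r) dt

which goes to zero as r approaches the Schwarzschild radius r_s, giving the "frozen at the horizon" picture described above.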
As far as I can tell, the black holes you're generating don't look especially correct in the preview: they should have a circular shadow like this: https://i.imgur.com/zeShgrx.jpeg
What the black hole looks like depends on how you define your field of view. And if the black hole is spinning, then you don't expect a circular shadow at all. But in our app, if you pick the "Static black hole" (the non-rotating, Schwarzschild case) and select the "Full FOV" option, then you will see the circular shadow that you expect.
The preview shows that you have a static black hole selected.
The shape of the shadow is also wrong for Kerr, though; this is what Kerr looks like:
https://i.imgur.com/3cS1fNI.png
Yes, but the preview has 'Realistic FOV' selected, not 'Full'. And the rotating black hole does have the same shadow as your image if you turn the rotation speed up.
"static" is "Shwarzchild without rotation"?
Do black holes have hair?
Where is the Hawking radiation in these models? Does it diffuse through the boundary and the outer system?
What about black hole jets?
What about vortices? With Gross-Pitaevskii and SQR Superfluid Quantum Relativity
https://westurner.github.io/hnlog/ Ctrl-F Fedi , Bernoulli, Gross-Pitaevskii:
> "Gravity as a fluid dynamic phenomenon in a superfluid quantum space. Fluid quantum gravity and relativity." (2015) https://hal.science/hal-01248015/ :
> FWIU: also rejects a hard singularity boundary, describes curl and vorticity in fluids (with Gross-Pitaevskii,), and rejects antimatter.
Actual observations of black holes:
"This image shows the observed image of M87's black hole (left) the simulation obtained with a General Relativistic Magnetohydrodynamics model, blurred to the resolution of the Event Horizon Telescope [...]" https://www.reddit.com/r/space/comments/bd59mp/this_image_sh...
"Stars orbiting the black hole at the heart of the Milky Way" ESO. https://youtube.com/watch?v=TF8THY5spmo&
"Motion of stars around Sagittarius A*" Keck/UCLA. https://youtube.com/shorts/A2jcVusR54E
/? M87a time lapse
/? Sagittarius A time lapse
/? black hole vortex dynamics
"Cosmic Simulation Reveals How Black Holes Grow and Evolve" (2024) https://www.caltech.edu/about/news/cosmic-simulation-reveals...
"FORGE’d in FIRE: Resolving the End of Star Formation and Structure of AGN Accretion Disks from Cosmological Initial Conditions" (2024) https://astro.theoj.org/article/94757-forge-d-in-fire-resolv...
STARFORGE
GIZMO: http://www.tapir.caltech.edu/~phopkins/Site/GIZMO.html .. MPI+OpenMP .. Src: https://github.com/pfhopkins/gizmo-public :
> This is GIZMO: a flexible, multi-method multi-physics code. The code solves the fluid using Lagrangian mesh-free finite-volume Godunov methods (or SPH, or fixed-grid Eulerian methods), and self-gravity with fast hybrid PM-Tree methods and fully-adaptive resolution. Other physics include: magnetic fields (ideal and non-ideal), radiation-hydrodynamics, anisotropic conduction and viscosity, sub-grid turbulent diffusion, radiative cooling, cosmological integration, sink particles, dust-gas mixtures, cosmic rays, degenerate equations of state, galaxy/star/black hole formation and feedback, self-interacting and scalar-field dark matter, on-the-fly structure finding, and more.
Hey, thanks for not collecting personal data for no good reason!
Neat. I'll probably use it for five minutes, appreciate the math that went into it, and move on. But nevertheless, pretty neat.
I say that because there's an idea to play with for a v1.1 that would give it staying power for me:
Do you have enough processing power on an iPhone to combine this with Augmented Reality? That is to say: can you explore "pinning" a singularity in a fixed region of space so I can essentially walk around it using the phone?
Assuming that's possible, you could continue evolving this into a very modest revenue generating app (like 2 bucks per year, see where it goes?) by allowing for people to pin singularities, neutron stars, etc. around their world and selectively sharing those with others who pass by. I'd have fun seeing someone else's pinned singularity next to the Washington monument, for instance. Or generally being able to play with gravity effects on light via AR.
Commenting to reinforce this idea: I'd love an AR approach where I can pin a black hole with a given radius into my living room, and walk around it!
The geosharing augmented reality thing mentioned by the parent comment is very very cool too, I'd pay a few bucks for that! Maybe make it social by letting black holes that people drop somewhere IRL merge, etc...
Reach out to me if you eventually would like to spin up a cheap bit of infrastructure to host the data of where people dropped their black holes, and need some help with that!
It would be neat to also get stats about the black hole depending on where you are in relation to it (obviously this breaks physics, as a micro black hole would immediately fall into the Earth). Everything is based on the Hawking radiation calculator: https://www.vttoth.com/CMS/physics-notes/311-hawking-radiati...
Example: Set mass of black hole to 1e12 metric tons, or about 100,000 great pyramids.
This has a Schwarzschild radius of 1485 femtometers (1 femtometer is around the size of a proton).
Nominal luminosity is 356 watts. You could power your computer! Lifetime is 1e12 gigayears.
An interesting thing comes with gravity. Gravity at the Schwarzschild radius for this mass is 3e28 m/s^2, but this is at a smaller-than-an-atom radius.
If you put your hand within a foot of it, gravity would be 700,000 m/s^2.
You would need to be at a distance of 270ft to experience gravity from it that compares to earth (9.8 m/s^2).
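As a sanity check, these figures follow from the standard formulas with M = 1e12 metric tons = 1e15 kg (so GM ≈ 6.674e4 m^3/s^2):

    r_s = 2GM/c^2 = 2 * (6.674e4) / (3e8)^2 ≈ 1.5e-12 m ≈ 1500 fm
    g(1 ft) = GM/r^2 = 6.674e4 / (0.3048)^2 ≈ 7.2e5 m/s^2
    r(g = 9.8 m/s^2) = sqrt(GM/g) = sqrt(6.674e4 / 9.8) ≈ 83 m ≈ 270 ft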
That is 356 watts of luminosity from something so small?! Whoa! It says the peak of the radiation has an energy of 41 keV though, so better not look at it directly (:
I tried plugging in some other numbers and, at first confusingly, found that the luminosity goes up at lower masses?! But of course, it radiates from its outer shell, not the entire volume.
Wonderful tool, imagine playing with those parameters in AR
Yes, this is one of the wonderful crazy properties of black holes: they get hotter as they evaporate! (More precisely, the Hawking temperature is inversely proportional to the mass!)
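Quantitatively:

    T_H = ħ c^3 / (8 π G M k_B) ≈ 1.2e23 K * (1 kg / M)
    L ∝ (horizon area) * T_H^4 ∝ M^2 * M^-4 = 1/M^2

so halving the mass quadruples the luminosity, which is exactly the "lower mass, brighter" behavior noted above.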
It's crazy how hot and luminous they get. At 45 seconds left in a black hole's life, it has the luminosity of 85,000 megatons of TNT, and only gets exponentially hotter as those 45 seconds count down. In the last fraction of a second of its life, with one metric ton of mass left, its luminosity is greater than the sun's.
That's an excellent idea! And indeed, part of the reason we started with the iPhone is because we've been thinking from the get-go about an eventual extension to Apple Vision Pro. As I wrote in my other comment, this is part of an outreach effort to get the public (and students) excited about black hole physics, so we will always keep the code free and open source.
You need a full 3D scan of the environment of everything the black hole can "see" from the position you want to put it in, not just the traditional "augmented reality" that sits on top of a current camera feed, because black holes are also essentially 360 degree cameras that from some angle will let you see anything around them. Not impossible, but harder than "just" taking an augmented reality feed.
It could be done in VR instead, where the entire environment is available.
Not everything needs to generate cash :)
Generating cash is a proxy for generating long term human value
Yikes no. Cash generation's a proxy for capturing human value, maybe. If we run with that:
- value can be generated, but not captured (generally good-natured humans do this constantly with those in their communities), and
- value can also be captured, but not generated (i.e. stolen, most of the largest corporations do this in one way or another via e.g. monopolization, political corruption, union busting, resource exploitation, real estate speculation, etc).
Please really think hard about what that means.
Let me give you an example of how backwards that is:
Are you telling me that, for example, Linus Torvalds (or any major contributor to Linux) has generated less long-term human value than a congressperson like Rick Scott or Mark Warner?
Linux runs on machines that literally keep people alive as well as that are used to create and display works of art.
Not everything needs to generate long term human value. It's okay to just have fun, too.
Like some intensive cash-generating activities: drug dealing, weapons trafficking, stealing, and money printing. Value at its best.
What a warped thing to believe in
Under what assumptions?
Just trying to guess at what they could be is costing me random time...
I recommend using a different preview screenshot on the App Store page. The first (most important) screenshot is without the effect at all. The use of the galaxy image doesn’t really reflect what it’s like to use the app.
Very nice – if only I could try it! :'-) Any chance this could be ported to Android, at least for high-end devices with a decent GPU?
The rotating black hole version that we implemented requires GPU code, and porting that to Android is nontrivial---though we'd love it if someone took our open-source code and ported it!
In the meantime, check out this code developed by Dominic Chang (grad student at Harvard) that implements lensing by a non-rotating (Schwarzschild) black hole in your browser: https://dominic-chang.com/bhi-filter/
First thing I wondered is what would happen if I pointed it at another screen with an image like this loaded. I realize that it's not realistic due to the z-axis and field of view, but it's pretty fun.
https://esahubble.org/images/heic0609a/
Just tried to check it out. On first boot it crashed; I killed the app and tried again, and now it won't open. I'll try a reinstall and do it over. iPhone 16 Pro, iOS 18.1
Quick edit: I did exactly that and now it works fine. The first boot seemed to get stuck when asking for permission to use the camera.
Glad it worked on second boot! We used to have some bugs in the elliptic integral implementation that led to the app crashing, but we think we've eliminated those, so hopefully this is just a fluke... Anyone else with this issue?
> Data Not Collected
> The developer does not collect any data from this app.
Well, duuh, nothing can escape the black hole, not even information!
> physically accurate
> event horizon doesn't appear in my room :(
Something to rejoice about, no? ;)
Instant download for me. I’m a sucker for anything black hole related.
Glad to hear that! You'll probably also enjoy reading about the Black Hole Explorer (BHEX): a proposed space mission that will take the sharpest images in the history of astronomy and resolve the "photon ring" of orbiting light around a black hole. https://www.blackholeexplorer.org/
I had no idea! Thank you!!
Did anybody else first think —before seeing the app images— that it was somehow using the camera of the iPhone to simulate the physics of the black hole?
It does
This is awesome. I see that this is GPL and open on GitHub. Thank you for sharing. If you are open to feature requests that I am too lazy and stupid to accomplish on my own, I would appreciate the option to drop the multi-camera view and the option to capture a photo. Also, plus one to the idea of being able to pin the black hole to a specific orientation so you can see what it looks like to pan around an object adjacent to the black hole.
Adding options to drop the multi-camera view and to capture a screenshot is relatively straightforward, and I think we can implement that in the next update. Pinning the black hole to a specific place is a whole other undertaking...
Thanks for responding. Pinning in space requires some tricky 3D math, so I get that it is a pain. The multi-view thing is also kind of cool since it shows how space distorts. I think I really just wanted to share the warped image, so a capture button for the distorted image is really all I need.
Not related to the app, but could someone explain how something with a huge though finite mass can create a singularity, a point of infinite density? Can there also be black holes that are just dark stars with intense gravity and a hard surface?
1. Division by zero
2. Neutron stars I think
That seems cool. It would be interesting to see a simulation of a Kerr-Newman BH. Although I have no idea what would be the best way to see the effects without some sort of perturbation. Not that this is an astrophysical BH, of course. Just a thought experiment.
I haven't looked into the equations but I expect the effect of varying the electric charge would be similar to (but less dramatic than) changing the spin of the black hole, which as you can see in our app is not that big of a change.
As above so below. I love how it looks so similar to a colonoscopy.
There are other apps out there for this kind of black hole vision...
Dude, I like your comments.
Would a person notice red-shifts from the black hole as well?
Yes, but one issue is that the amount of redshift depends on the motion of the emitter, so we would have to artificially assign some four-velocity to your surroundings in order to give them some redshift. There doesn't seem to be a "natural" choice for how to do this.
TLDR: redshift depends not only on the position of the source, but also its velocity.
Since you don't notice any red-shift with your eyes in daily life, why is zero velocity relative to the camera not a natural choice? Or maybe I'm not following you?
If you put the source infinitely far away (as we are doing here) and at zero velocity relative to the camera, then there is no redshift effect at all, so you could say that this is the choice of redshift that we made :)
If you want the details, they're too long to put in this comment but essentially what I mean is that the r->infty limit of the redshift factor in Eq. (B22) of this paper is unity: https://arxiv.org/abs/2211.07469
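For intuition in the simpler Schwarzschild case: the redshift factor between a static emitter at radius r_e and a static observer at r_o is

    g = sqrt(1 - r_s/r_e) / sqrt(1 - r_s/r_o)

which tends to 1 when both are far from the hole. Since the photon's energy is conserved along its path in a stationary spacetime, this holds even for rays that pass close to (or orbit) the hole: the blueshift on the way in exactly cancels the redshift on the way out.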
Maybe my original question wasn't specific enough? So you are saying there is no "extra" red-shifting due to the gravitational effects of the black hole for light coming from behind the black hole? Some photons skim "close" to the event horizon, and have to climb out of the gravitational well, but they don't end up red-shifted for the camera (because maybe they were blue-shifted on the way in from infinity?). Some photons sufficiently close to the event horizon will even do an orbit around the black hole (one or more times) before reaching the camera. And those photons that made the orbit maybe do have "extra" red-shift, because they are essentially no longer coming from infinity? Maybe? ???
Is there zero gravity in the middle of a black hole? And is there another event horizon somewhere inside the black hole?
I don’t think anybody really knows what‘s inside a black hole. That’s kind of their defining property.
Of course we do - everything that fell in but wasn't radiated out ;)
Well technically it approaches infinite gravity. It's a gravitational asymptote. But like the other commenter said, no way to know what it actually is in reality, as we only have mathematical concepts that may or may not match reality.
So cool! Vision Pro version?
We're thinking about it! Likely 2025 though.
Does it use iPhone-specific features, or could it work on, e.g., a desktop?
We wanted the app to work on an iPhone and that required the use of Apple Metal code. This could of course be ported to a desktop but we're not sure there would be much interest in that?
Maybe WebGPU would be a good porting target.
Really cool app btw!
I once saw a video of Kip Thorne explaining that the black hole visual effects in Interstellar came from an actual physical simulation. I wouldn't have thought it was feasible to run on an iPhone.
The black hole simulation that was shown in the movie Interstellar is explained in detail in this paper, freely available on the arXiv: https://arxiv.org/abs/1502.03808
As a physicist with a modest background in computing, I was also surprised by how powerful the iPhone GPU is. It can indeed lens the input from the camera at high resolution and in real time with high FPS.
Cool, thanks for the reference!
I was able to install it on my M1 Mac, fwiw.
Excellent! Hope it looked cool.
What happens with the rotating one and a realistic FOV?
It looks needlessly complicated and messy because the visually interesting region when rotation is turned on is blocked out by the FOV cutouts. We felt it was best to only allow the user to select the full FOV in this mode.
Thanks for the question!
No plans for an Android version?
Not currently. As I mentioned elsewhere, the rotating black hole version that we implemented requires GPU code, and porting that to Android is nontrivial---though we'd love it if someone took our open-source code and ported it!
In the meantime, check out this code developed by Dominic Chang (grad student at Harvard) that implements lensing by a non-rotating (Schwarzschild) black hole in your browser: https://dominic-chang.com/bhi-filter/
A bit off topic, but does anyone else get weird nightmares of black holes?
Pedantic, but can I ask why this app requires iOS 17.5 or later? For reference, the latest iOS version is 18. What specific API is being used that requires that version?
Good point: the minimum should be an earlier version of iOS, since we don't use any APIs that are only available in 17.5 or later.
Thanks for pointing that out.
Glad to hear it; this blocked the install for me, so it's a bit more than pedantic :)
I wonder if this would be better as a webpage and not an app. It’s something I want to share far and wide for everyone to play with for five minutes. But as an app, most people I send it to won’t go install it.
I’m no astrophysicist but it all looks doable with the camera API, canvas API, and WebGL or WebGPU shaders. That actually sounds like a lot of fun.
Yes I was wondering if they used shaders to implement this.
Oh man this so reminds me of the old iPhone apps which were so epic and so cool
As always, wonder what a particular "free" thing is selling and to whom. In this case it's something called BHEX, to NASA.
As Project Scientist for BHEX, I am of course excited about the project and eager to spread the word about it! But as I wrote in my other comment, what this is really trying to "sell" is gravitational physics to students interested in black holes, and this effort is supported in part by the National Science Foundation.
Does anyone else find it jarring to unexpectedly be shown the selfie camera view? Showing both camera feed thumbnails constantly while using this app is a little odd.
Still, kinda fun; reminds me of playing around with different blur / Liquify filters in Photoshop back in the day.
Good point. In a future update, we can add a button to show / hide the camera views.
Dad
Another nice feature would be if it could simulate an accretion disk.
That would be cool, but then you wouldn't be seeing the world around you anymore. In other words, at that point it becomes a GRMHD simulation, and there is no point in using cameras since the user's environment is obscured, no? Or did you have something else in mind?
This is not a simulation of a black hole, but rather an image filter that emulates one particular effect.
Yes, agreed. We thought it would be fair to call it a "simulation" of what your surroundings would look like if a black hole were within your FOV, but as you say we do not take into account all effects (time delays in particular would require a lot of buffering and we decided this would be impractical to implement, and not that illuminating).
This is still nice when there are so many artistic images of black holes that do not take such care to use known physics to create an accurate image. Well done all. Looking forward to seeing what BHEX sees.
Glad to hear you're excited about BHEX---we are too!
If you want to read more about what it's going to do, I wrote a blog post about it on the mission website: https://www.blackholeexplorer.org/bhex-blog/lupsasca-stateme...
Read all of it, only question I have is... napkin math, how much more resolution over EHT alone?
A ~5x improvement. Recall that the resolution of an interferometric array is set by the distance between telescopes measured in units of the observation wavelength. BHEX will get a ~3x resolution improvement from the increased distance between the space satellite and our ground telescopes (for the EHT, the max telescope separation is limited by the diameter of the Earth) and another ~50% from the increased frequency of observations (going up from 230 to 320 GHz).
5x is actually a lot: we'll be able to resolve the "photon ring" of orbiting light around M87* and Sgr A* (the two black holes previously imaged by EHT at lower resolution) and likely see the "shadows" of another 6-8 black holes, with the possibility of estimating the mass of another ~20-30 sources.
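Explicitly, the diffraction limit of an interferometer with baseline B observing at wavelength λ is θ ≈ λ/B. For the EHT at 230 GHz, with the baseline capped by the Earth's diameter:

    λ = c/f ≈ (3e8 m/s) / (230e9 Hz) ≈ 1.3 mm
    θ ≈ λ/B ≈ (1.3e-3 m) / (1.27e7 m) ≈ 1.0e-10 rad ≈ 21 microarcseconds

A ~3x longer baseline at 320 GHz then shrinks θ by roughly 3 * (320/230) ≈ 4-5x, consistent with the figures above.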
That's absolutely awesome, thank you.
You're right that the time delays and redshifting wouldn't add much to a toy app, but some of us are here for the physics.
Honestly it's not so far-fetched (to me) that in a few years someone will have GRRMHD simulations running in real time on a portable device.
Are you familiar with A Slower Speed of Light? It's a game which has some nice special-relativistic effects.
http://gamelab.mit.edu/games/a-slower-speed-of-light/
Yes, such a great game---it's a fantastic visualization of special relativity and also fun to play!
I think we're still a ways off from real time GRMHD sims, but CK Chan from UArizona had a working VR simulation (on the Oculus iirc, but now deprecated) that allowed you to explore a pre-existing GRMHD simulation in real time and in 3D. I think he might be working on a new version of this.
That's awesome. It's extra crucial to have engaging outreach when your research is so far from application. There's so much scope for wowing people with astro and if you can enrich our culture and justify funding at the same time that's a win-win.
(Just for clarity, the second R in GRRMHD is for radiation. I know it's typical to just push some photons through the GRMHD results to produce renders, but since I'm dreaming, let's treat the radiation self-consistently.)
Hahaha, I half-expected a plasma physicist to complain that GRRMHD is not self-consistent and we need to include non-ideal fluid effects :D
Touché. While we're at it there are almost certainly some non-negligible QED processes. I guess there wouldn't be jobs for physicists if this stuff was straightforward.
I've been able to port NR (numerical relativity) to GPUs; with sufficiently powerful hardware, it can run simulations of binary black hole collisions at ~30 fps with raytracing. You need something around a top-end consumer GPU at the moment. Phone hardware needs a while to catch up: there's an absolute minimum memory requirement of ~8 GB of VRAM, and you need a lot more bandwidth than phones currently support.
Awesome! Is it published anywhere? All the stuff I'm familiar with is aimed at old-school clusters.