
Written by Devin Coldewey

An Adidas experiment and a whole new exterior facility head to the ISS next month

March 2 is the planned launch date for SpaceX’s 20th ISS resupply mission, which is bringing the usual supplies and goodies, plus a payload of interesting experiments from partners and paying customers. And a big expansion to Europe’s Columbus Module.

The most ridiculous has to be Adidas’s “BOOST in Space” effort. The company creates its midsoles by fusing together thousands of tiny foam spheres. But sadly, this is generally done on Earth, where there’s gravity. So of course they want to try it in space to see what they can learn.

“Microgravity enables a closer look at the factors behind pellet motion and location, which could enhance manufacturing processes as well as product performance and comfort,” the project description reads. It also makes for a great stunt. The revelations from this toaster-sized device will surely lead to better shoes.

It’s funny, but as always with these commercial operations, it’s pretty cool that a company can simply decide to do some experimenting on the ISS.

Microgravity is a sought-after condition and several of the other research projects going up rely on it as well. Another commercial operation is from Delta, the faucet maker, which thinks it might be able to learn something about droplet formation and create more efficient showers and such.

Gut tissue isn’t normally this blue, but that’s not an effect of microgravity.

Emulate is sending up an organ-on-a-chip, intestinal tissue to be precise, which it hopes will help teach us “how microgravity and other potential space travel stressors affect intestine immune cells and susceptibility to infection.” They’re also testing the growth of heart tissue from stem cells up there, which could come in handy on long voyages.

The biggest payload, though, has to be Bartolomeo, a new exterior platform that will attach to the European Columbus Module:

With payloads attached, left, and without, right. There’s a boom that sticks out for other purposes.

It has 12 sites to which payloads from commercial and institutional partners can be attached — anyone from universities to companies that need access to the exterior of the space station for one reason or another. Earth imagery, vacuum exposure, radiation testing, whatever you like. You can read about the specifications here.

The launch is set for March 2 if all goes well — we’ll post the live stream and any other updates closer to T-0.

How ‘The Mandalorian’ and ILM invisibly reinvented film and TV production

“The Mandalorian” was a pretty good show. On that most people seem to agree. But while a successful live-action Star Wars TV series is important in its own right, the way this particular show was made represents a far greater change, perhaps the most important since the green screen. The cutting-edge tech (literally) behind “The Mandalorian” creates a new standard and paradigm for media, and the audience will be none the wiser.

What is this magical new technology? It’s an evolution of a technique that’s been in use for nearly a century in one form or another: displaying a live image behind the actors. But the advance is not in the idea but the execution: a confluence of technologies that redefines “virtual production” and will empower a new generation of creators.

As detailed in an extensive report in American Cinematographer Magazine (I’ve been chasing this story for some time but suspected this venerable trade publication would get the drop on me), the production process of “The Mandalorian” is completely unlike any before, and it’s hard to imagine any major film production not using the technology going forward.

“So what the hell is it?” I hear you asking.

Meet “The Volume.”

Formally called Stagecraft, it’s 20 feet tall, 270 degrees around, and 75 feet across — the largest and most sophisticated virtual filmmaking environment yet made. ILM just today publicly released a behind-the-scenes video of the system in use as well as a number of new details about it.

It’s not easy being green

In filmmaking terms, a “volume” generally refers to a space where motion capture and compositing take place. Some volumes are big and built into sets, as you might have seen in behind-the-scenes footage of Marvel or Star Wars movies. Some are smaller, plainer affairs where actors whose motions will drive CG characters play out their roles.

But they generally have one thing in common: They’re static. Giant, bright green, blank expanses.

Does that look like fun to shoot in?

One of the most difficult things for an actor in modern filmmaking is getting into character while surrounded by green walls, foam blocks indicating obstacles to be painted in later, and people with mocap dots on their face and suits with ping-pong balls attached. Not to mention everything has green reflections that need to be lit or colored out.

Advances some time ago (think prequels-era Star Wars) enabled cameras to display a rough pre-visualization of what the final film would look like, instantly substituting CG backgrounds and characters onto monitors. Sure, that helps with composition and camera movement, but the world of the film isn’t there, the way it is with practical sets and on-site shoots.

Practical effects were a deliberate choice for “The Child” (AKA Baby Yoda) as well.

What’s more, because of the limitations in rendering CG content, the movements of the camera are often restricted to a dolly track or a few pre-selected shots for which the content (and lighting, as we’ll see) has been prepared.

This particular volume, called Stagecraft by ILM, the company that put it together, is not static. The background is a set of enormous LED screens such as you might have seen on stage at conferences and concerts. The Stagecraft volume is bigger than any of those — but more importantly, it’s smarter.

See, it’s not enough to just show an image behind the actors. Filmmakers have been doing that with projected backgrounds since the silent era! And that’s fine if you just want to have a fake view out of a studio window or fake a location behind a static shot. The problem arises when you want to do anything more fancy than that, like move the camera. Because when the camera moves, it immediately becomes clear that the background is a flat image.

The innovation in Stagecraft and other, smaller LED walls (the more general term for these backgrounds) is not only that the image shown is generated live in photorealistic 3D by powerful GPUs, but that the 3D scene is directly affected by the movements and settings of the camera. If the camera moves to the right, the image alters just as if it were a real scene.

This is remarkably hard to achieve. In order for it to work the camera must send its real-time position and orientation to, essentially, a beast of a gaming PC, since this and other setups like it generally run on the Unreal engine. This must take that movement and render it exactly in the 3D environment, with attendant changes to perspective, lighting, distortion, depth of field and so on — all fast enough so that those changes can be shown on the giant wall a fraction of a second later. After all, if the movement lagged even by a few frames it would be noticeable to even the most naive viewer.
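
The parallax problem driving all this can be illustrated with a toy pinhole-camera model (my own sketch for illustration, not ILM’s actual pipeline): when the camera dollies sideways, nearby objects must sweep across the frame far more than distant ones. A flat backdrop shifts everything by the same amount; only a live 3D render gets the per-depth shifts right.

```python
import math

def project(point, cam_pos, focal_px=1000.0):
    """Pinhole projection: world point -> horizontal pixel offset from center.

    The camera looks down +Z; cam_pos is its position in the same (x, y, z)
    world coordinates (units: meters). Toy model -- rotation is ignored.
    """
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    return focal_px * x / z

near = (0.0, 0.0, 5.0)     # a set prop 5 m from the camera
far  = (0.0, 0.0, 50.0)    # a background mountain 50 m away

for cam_x in (0.0, 0.5):   # dolly the camera 0.5 m to the right
    cam = (cam_x, 0.0, 0.0)
    print(f"cam at x={cam_x}: near shifts to {project(near, cam):+.0f}px, "
          f"far to {project(far, cam):+.0f}px")
```

Dollying half a meter moves the near prop 100 pixels but the distant mountain only 10 — a shift a static image cannot reproduce. The same arithmetic shows why latency matters: at 24 fps a frame lasts roughly 42 ms, so tracking, rendering and display output must all fit inside that window for the wall to stay in lockstep with the camera.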

Yet fully half of the scenes in “The Mandalorian” were shot within Stagecraft, and my guess is no one had any idea. Interior, exterior, alien worlds or spaceship cockpits, all used this giant volume for one purpose or another.

There are innumerable technological advances that have contributed to this; “The Mandalorian” could not have been made as it was five years ago. The walls weren’t ready; the rendering tech wasn’t ready; the tracking wasn’t ready — nothing was ready. But it’s ready now.

It must be mentioned that Jon Favreau has been a driving force behind this filmmaking method for years now; films like the remake of “The Lion King” were in some ways tech tryouts for “The Mandalorian.” Combined with advances made by James Cameron in virtual filmmaking and of course the indefatigable Andy Serkis’s work in motion capture, this kind of production is only just now becoming practical thanks to a confluence of circumstances.

Not just for SFX

Of course, Stagecraft is probably also among the most expensive and complex production environments ever used. But what it adds in technological overhead (and there’s a lot) it more than pays back in all kinds of benefits.

For one thing, it nearly eliminates on-location shooting, which is phenomenally expensive and time-consuming. Instead of going to Tunisia to get those wide-open desert shots, you can build a sandy set and put a photorealistic desert behind the actors. You can even combine these ideas for the best of both worlds: Send a team to scout locations in Tunisia and capture them in high-definition 3D to be used as a virtual background.

This last option produces an amazing secondary benefit: Reshoots are way easier. If you filmed at a bar in Santa Monica and changes to the dialogue mean you have to shoot the scene over again, no need to wrangle permits and painstakingly light the bar again. Instead, the first time you’re there, you carefully capture the whole scene with the exact lighting and props you had there the first time and use that as a virtual background for the reshoots.

The fact that many effects and backgrounds can be rendered ahead of time and shot in-camera rather than composited in later saves a lot of time and money. It also streamlines the creative process, letting filmmakers and actors make decisions on the spot, since the volume reacts to their needs rather than vice versa.

Lighting is another thing that is vastly simplified, in some ways at least, by something like Stagecraft. The bright LED wall can provide a ton of illumination, and because it actually represents the scene, that illumination is accurate to the needs of that scene. A red-lit space station interior, with the usual falling sparks and so on, shows red on the actors’ faces and of course on the highly reflective helmet of the Mandalorian himself. Yet the team can also tweak it, for instance sticking a bright white line high on the LED wall, out of sight of the camera, that creates a pleasing highlight on the helmet.

Naturally there are some trade-offs. At 20 feet tall, the volume is large but not so large that wide shots won’t capture the top of it, above which you’d see cameras and a different type of LED (the ceiling is also a display, though not as powerful). This necessitates some rotoscoping and post-production, or limits the angles and lenses one can shoot with — but that’s true of any soundstage or volume.

A shot like this would need a little massaging in post, obviously.

The size of the LEDs, that is, of the pixels themselves, also limits how close the camera can get to them, and of course you can’t zoom in on an object for closer inspection. If you’re not careful you’ll end up with moiré patterns, those interference stripes you often see in images of screens.

Stagecraft is not the first application of LED walls — they’ve been used for years at smaller scales — but it is certainly by far the most high-profile, and “The Mandalorian” is the first real demonstration of what’s possible using this technology. And believe me, it’s not a one-off.

I’ve been told that nearly every production house is building or experimenting with LED walls of various sizes and types — the benefits are that obvious. TV productions can save money but look just as good. Movies can be shot on more flexible schedules. Actors who hate working in front of green screens may find this more palatable. And you better believe commercials are going to find a way to use these as well.

In short, a few years from now it’s going to be uncommon to find a production that doesn’t use an LED wall in some form or another. This is the new standard.

This is only a general overview of the technology that ILM, Disney, and their many partners and suppliers are working on. In a follow-up article I’ll be sharing more detailed technical information directly from the production team and technologists who created Stagecraft and its attendant systems.

Astronomers warn of ‘worrisome’ light pollution from satellite constellations

The International Astronomical Union has issued the preliminary results of a study on the potential effects of multi-thousand-satellite constellations like the one being built by Starlink. Finding that Earth-based astronomical observations may be “severely affected,” the body warned that mitigation measures and rules had better be formed sooner rather than later.

The group expressed its concerns last summer, but undertook a broader study and survey of possible effects, asking various observatories and organizations to chime in. The general feeling is one of “hope for the best, but prepare for the worst.”

According to the IAU’s estimates, once there are tens of thousands of satellites in low Earth orbit, somewhere around 1,500 will be above the horizon at any given time, though fewer (250-300) would be more than 30 degrees above it, in the area usually observed by astronomers.
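
These figures follow from simple spherical geometry, and a toy model reproduces their general shape (this is my own back-of-the-envelope sketch, not the IAU’s method: it assumes satellites spread uniformly over a single shell at a Starlink-like 550 km altitude, whereas real constellations use multiple shells at various altitudes).

```python
import math

R_E = 6371.0  # Earth's mean radius, km

def visible_fraction(alt_km, min_elev_deg=0.0):
    """Fraction of a uniform satellite shell above a given elevation angle.

    Satellites are assumed evenly spread over a spherical shell at altitude
    alt_km; the observer sits on the Earth's surface.
    """
    r = R_E + alt_km
    target = math.radians(min_elev_deg)

    def elevation(cos_theta):
        # cos_theta: cosine of the central angle between observer and satellite
        d = math.sqrt(r * r + R_E * R_E - 2.0 * r * R_E * cos_theta)
        return math.asin((r * cos_theta - R_E) / d)

    # Bisect for the central angle where line-of-sight elevation == target
    lo, hi = R_E / r, 1.0          # horizon .. zenith
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if elevation(mid) < target:
            lo = mid
        else:
            hi = mid
    # Fraction of the sphere with cos(theta) above the threshold
    return (1.0 - lo) / 2.0

N = 40_000  # hypothetical total constellation size ("tens of thousands")
print(f"above horizon: ~{N * visible_fraction(550.0):.0f}")
print(f"above 30 deg:  ~{N * visible_fraction(550.0, 30.0):.0f}")
```

With these assumptions, roughly 4% of the constellation sits above the horizon at any moment (about 1,600 satellites for N = 40,000), but well under 1% sits above 30 degrees of elevation — the same order as the IAU’s estimates.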

“The vast majority” will be too faint to be seen by the naked eye except during specific periods when the sun’s light is more likely to reflect off their surfaces — in the early hours of darkness, generally. Measures are being taken to reduce the visibility and reflectivity of these supernumerous satellites, but we won’t be sure how effective they are until they’re up there, at which point of course it is too late to do anything about it.

More “worrisome,” as the IAU puts it, is the potential effect on wide-field observations like those of the Large Synoptic Survey Telescope (lately renamed the Rubin Observatory). Almost a third of 30-second exposures from such telescopes could be affected by satellites overhead, which will be far more visible to their sensitive instruments.

There may be ways around this, but it’s hard not to read a sense of frustration into the IAU’s statement:

In theory, the effects of the new satellites could be mitigated by accurately predicting their orbits and interrupting observations, when necessary, during their passage. Data processing could then be used to further “clean” the resulting images. However, the large number of trails could create significant and complicated overheads to the scheduling and operation of astronomical observations.

In other words, if the operators of these constellations refuse to do anything about it, there are at least things we can do. But they won’t be without cost or drawbacks.

This all relates strictly to visible-light issues; possible interference with observations of radio-frequency and other invisible radiation due to the transmissions of these constellations is still something of an unknown.

Ultimately, though the IAU’s statement is careful to maintain a veneer of neutrality, it’s clear they’re all rather put out.

“A great deal of attention is also being given to the protection of the uncontaminated view of the night sky from dark places, which should be considered a non-renounceable world human heritage,” they write. “There are no internationally agreed rules or guidelines on the brightness of orbiting man-made objects. While until now this was not considered a priority topic, it is now becoming increasingly relevant. Therefore the IAU will regularly present its findings at the meetings of the UN Committee for Peaceful Uses of Outer Space, bringing the attention of the world government representatives to the threats posed by any new space initiative on astronomy and science in general.”

In other words, they’re not going to quietly sit in their observatories and let a handful of companies clutter up the night sky.