AI gives Thanos a soul in 'Avengers: Infinity War'
Spoilers ahead for Avengers: Infinity War.
Thanos isn't your usual Marvel nemesis. Then again, even after 19 films in Disney's superhero universe, it's not as if he's faced much strong competition. Aside from the puckish Loki and the tragic Killmonger, most Marvel villains have been pretty forgettable. Now, after years of build-up (we first caught a glimpse of Thanos in 2012's The Avengers), he finally took center stage in this summer's Avengers: Infinity War.
But what's most intriguing about Thanos isn't that he wants to wipe out half of life across the universe — instead, it's that he's a big purple alien who feels genuine emotion. He cries when he's forced to sacrifice Gamora, his adopted daughter. He feels pain and anguish. But like many memorable bad guys, he believes it's all for the greater good.
Sharp-eyed viewers will notice that Thanos looks very different in Infinity War than he did in The Avengers post-credits scene. That's not just due to advances in CG technology. "We all came to the conclusion that the performance would actually come through a little bit better if we introduced a little bit more of some of the recognizable features of Josh Brolin," said Kelly Port from the VFX company Digital Domain, one of many effects firms working on the film.
Digital Domain also used a piece of custom machine learning software called Masquerade to make the motion capture performance seem more natural. The process starts with 100 to 150 tracking dots placed on Brolin's face, which is captured by two vertically oriented HD cameras. It's not meant to be a high-resolution scan; instead, it's a fairly low-quality mesh. That's fed into a machine learning algorithm trained on a library of high-res face scans covering a wide variety of expressions.
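To make that concrete, here's a minimal toy sketch in Python of what a dots-to-mesh solve could look like, assuming a simple ridge-regression model over a scan library. Every name, shape, and the regression model itself are illustrative assumptions; Digital Domain hasn't published Masquerade's internals.

```python
# Toy sketch of a low-res-to-high-res face solve; NOT Digital Domain's
# actual Masquerade code. Random data stands in for real capture data.
import numpy as np

rng = np.random.default_rng(0)

N_DOTS = 150       # tracking dots on the actor's face (low-res input)
N_VERTS = 5_000    # vertices per high-res scan (kept small for the sketch)
N_SCANS = 500      # library of high-res scans across many expressions

# Training library: paired (tracked dot positions, high-res scan) examples.
X = rng.normal(size=(N_SCANS, N_DOTS * 3))   # flattened 3D dot positions
Y = rng.normal(size=(N_SCANS, N_VERTS * 3))  # flattened 3D vertex positions

# Learn a regularized linear map from dots to vertices (ridge regression).
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

def solve_high_res(dots: np.ndarray) -> np.ndarray:
    """Predict a full high-res face shape from one frame of tracked dots."""
    return (dots.reshape(1, -1) @ W).reshape(N_VERTS, 3)

frame = rng.normal(size=(N_DOTS, 3))   # one captured frame of dots
high_res_mesh = solve_high_res(frame)
print(high_res_mesh.shape)             # (5000, 3)
```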
"[Masquerade] takes that low resolution mesh and it figures out what high resolution shape face would be the best solution for that," Port said. "Then it gives you a solution, and then we would look at that result. If it didn't feel quite right, we would make a little tweak in modeling to adjust ... let's say this has more lip compression or the brows need to be higher, we feed that back into the system and it would learn from that via a machine learning algorithm."
The next time Digital Domain puts the low-res mesh through its system, it should have a better result than before. But that's just step one. Next up is a process called direct drive, which takes the high-resolution face mask performance and places it on Thanos's character model.
"And then we kind of go through a similar process in that we look at them side by side," Port said. "We look at Josh's performance and it's like, 'He's more surprised,' or 'He's more sad here," and there's something in Josh's performance that's not coming across in Thanos that we'd make a little adjustment. We'd feed that back through the system, and then hopefully the next time that same input gets fed into the direct drive system, the result would be more accurate or more what we desired."
Without a machine learning system like Masquerade, VFX artists have to tweak facial performances manually in animation, a process that can be far more time-consuming. Still, there are other modern techniques, like WETA's Oscar-winning FACETS, which was used for facial tracking on Avatar and the recent Planet of the Apes trilogy.
"We knew going in Thanos has to work, or the movie doesn't work," said Dan Deleeuw, Marvel Studio's VFX supervisor. So from the start, his team was focused on understanding him as a character. Based on the glimpses of him we've seen before in Marvel films, they knew he'd be a large and angry character — a giant who's literally railing against the universe. But Marvel also wanted to capture subtle aspects of Josh Brolin's performance, especially his face.
The first day on set, directors Joe and Anthony Russo wasted no time getting Brolin in a motion capture helmet and suit to test out some of his lines. But they also went a step further. "Instead of cutting when they stop doing the lines, we just kept the motion capture going," Deleeuw said. "We kept when he was just experimenting with the different lines and how he would approach Thanos."
Using those off-the-cuff line takes, Marvel Studios was able to capture nuances that Deleeuw didn't originally plan for. "Just being able to read almost imperceptible movements in his face... movements in his eyes and his cheeks, and then you know later on to show his frustration or sadness with Gamora, or his anger with Tony... just really bring a character like that to the screen, I think was one of the biggest challenges," he said.
"Doug Roble, the guy that's working on that [Digital Domain] software said something along the lines of, 'If you're not using machine learning in your software, you're doing it wrong,'" Deleeuw said, recounting a recent visit to the VFX company. Looking ahead, the technology will be used for more than just faces -- it could help with things like water simulations. Eventually, you can expect machine learning to have a role just about everywhere when it comes to visual effects.