Meta’s new Make-a-Video AI can generate quick movie clips from text prompts

Meta unveiled its Make-a-Scene text-to-image generation AI in July, which, like DALL-E and Midjourney, utilizes machine learning algorithms (and massive databases of scraped online artwork) to create fantastical depictions of written prompts. On Thursday, Meta CEO Mark Zuckerberg revealed Make-a-Scene’s more animated contemporary, Make-a-Video.

As its name implies, Make-a-Video is “a new AI system that lets people turn text prompts into brief, high-quality video clips,” Zuckerberg wrote in a Meta blog Thursday. Functionally, Video works the same way Scene does — relying on a mix of natural language processing and generative neural networks to convert non-visual prompts into images — it just outputs content in a different format.

“Our intuition is simple: learn what the world looks like and how it is described from paired text-image data, and learn how the world moves from unsupervised video footage,” a team of Meta researchers wrote in a research paper published Thursday morning. Doing so enabled the team to reduce the amount of time needed to train the Video model and eliminate the need for paired text-video data, while preserving “the vastness (diversity in aesthetic, fantastical depictions, etc.) of today’s image generation models.”   
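Conceptually, that two-stage recipe can be pictured as freezing the spatial layers of a text-to-image model and training only new temporal layers on unlabeled video. The sketch below is our own simplified illustration of that idea in PyTorch, not Meta’s actual architecture:

```python
# Conceptual sketch (our simplification, not Meta's implementation): spatial
# layers learned from text-image pairs stay frozen and see each frame as an
# image; new temporal layers mix information across frames and would be the
# only part trained on unlabeled video.
import torch
import torch.nn as nn

class SpatialBlock(nn.Module):
    """Stand-in for a layer of a pretrained text-to-image model (frozen)."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        for p in self.parameters():
            p.requires_grad = False  # knowledge from text-image training stays fixed

    def forward(self, x):  # x: (batch * time, channels, H, W)
        return self.conv(x)

class TemporalBlock(nn.Module):
    """New layer that convolves over the time axis; learns motion from video."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv1d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):  # x: (batch, time, channels, H, W)
        b, t, c, h, w = x.shape
        x = x.permute(0, 3, 4, 2, 1).reshape(b * h * w, c, t)  # conv over time
        x = self.conv(x)
        return x.reshape(b, h, w, c, t).permute(0, 4, 3, 1, 2)

frames = torch.randn(2, 8, 16, 32, 32)           # (batch, time, channels, H, W)
spatial, temporal = SpatialBlock(16), TemporalBlock(16)
per_frame = spatial(frames.flatten(0, 1)).view_as(frames)  # frames treated as images
video_features = temporal(per_frame)                        # motion handled here
print(video_features.shape)  # torch.Size([2, 8, 16, 32, 32])
```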

As with most of Meta’s AI research, Make-a-Video is being released as an open-source project. “We want to be thoughtful about how we build new generative AI systems like this,” Zuckerberg noted. “We are openly sharing this generative AI research and results with the community for their feedback, and will continue to use our responsible AI framework to refine and evolve our approach to this emerging technology.”

As with seemingly every generative AI that is released, the opportunity for misuse of Make-a-Video is not a small one. To get ahead of any potential nefarious shenanigans, the research team preemptively scrubbed the Make-a-Video training dataset of any NSFW imagery as well as toxic phrasing.     

Amazon is expanding the Astro’s abilities for both home and business

While Amazon is widely known for its Ring brand of doorbell camera home security systems, the company last year introduced a more mobile, and way more adorable, monitoring platform: Astro. The $1,500 automaton (which is currently on sale for $999) essentially serves as an Alexa on wheels, trundling about your home like an AIBO that also manages your calendar and doubles as a guard dog. On Wednesday, Amazon unveiled a slew of new features for Astro, including one that can now detect the presence of your real cat or dog. 

The new feature, which will be available later this year, will trigger while the Astro is “on patrol” around your home. When it encounters your pet, Astro will capture a short video clip of them and share it with you via Live View (part of the Alexa Together system). 

“You can use Live View to tell your dog to get off the couch, or you can take a picture of what they’re doing to add to your pet scrapbook,” Ken Washington, vice president of Consumer Robotics, said during the event. “We think this feature will be especially useful by providing a live connection to your pets so that you have peace of mind about them, no matter where you are.”

Astro is also gaining some added situational awareness. The robot can already map out its patrol routes through your home but, with a new multimodal AI capability, Astro will actively pay attention to “things in your home that you want it to learn about—and better notify you if something isn’t right,” Washington said. Basically, Astro will learn by looking at an object (say, a door) and listening to you speak about it (“that door should always be closed”), then incorporate that information into its monitoring duties. If it detects an issue, the Astro will snap a picture of it and send it to you with a request for further instructions.

For those of you itching to add bespoke features to your own Astro, Amazon is also releasing a new SDK. There’s no word yet on when it will be made publicly available. Washington noted that “to start, we’ll begin working with three of the world’s leading robotics schools later this year—the Georgia Institute of Technology, the University of Maryland, and the University of Michigan—to put an early form of the SDK into their students’ hands.” More official Astro features are in the pipe, Washington assured, and once they’re ready, they’ll be made available as OTA software updates.

BMW’s next in-vehicle voice assistant will be built from Amazon Alexa

BMW began incorporating smart voice features into its infotainment systems using Amazon’s Alexa in 2018. In the intervening years, the number of models sporting the digital assistant has only increased. At Amazon’s 2022 Devices & Services event on Wednesday, the two companies announced a deepening of their partnership: BMW’s next generation of infotainment systems will feature an Alexa-based assistant developed specifically with the driver in mind.

The as-yet-unnamed BMW assistant will be built on Alexa Custom Assistant, “a comprehensive solution that makes it easy for BMW and other brands and device makers to create their own custom intelligent assistant tailored to their brand personality and customer needs.” The new assistant “will enable an even more natural dialogue between driver and vehicle,” per a Wednesday BMW press release. That might mean, for example, a proactive notification alerting the driver that the battery charge is low while the assistant automatically reserves a charging slot at the next off-ramp, or preemptively schedules regular service with the local dealership.

Amazon’s redesigned Echo Auto will better integrate with your vehicle

Building on its success convincing the public to outfit their homes and offices with various Alexa-enabled Echo devices, Amazon introduced the very first Echo Auto in 2018. More than a million pre-orders and four years later, the Echo Auto is getting an upgrade, Amazon announced Wednesday at its 2022 Devices & Services event.

The new unit will be slimmer than its predecessor and will include a mounting plate that adheres more securely than the last version — so make sure you really like where it’s positioned before taking off the backing film. The unit still leverages five separate mics to pick up commands over road noise, so you’ll have a good amount of flexibility in where you place it. Once installed, it does what every Alexa does: respond to voice commands. It handles the standard fare of playing music — including a “follow me” function that lets you switch audio from your home stereo to the vehicle as you get in — as well as navigation and hands-free calls.

“Ambient technology is at its best in environments where people are focused on other tasks, and nowhere is that more important than in the car,” Heather Zorn, Amazon’s vice president for Alexa said during the event. “Voice can minimize distractions and help you keep your eyes on the road so you can focus on the fun of driving.”

What’s more, with help from Amazon’s cloud, the $55 Echo Auto will also be able to alert the driver when their pre-ordered Whole Foods grocery order is ready for pickup, and can summon a tow truck if you run out of gas. Simply say, “Alexa, call Roadside Assistance.”

NYU is building an ultrasonic flood sensor network in New York’s Gowanus neighborhood

People made some 760 million trips aboard New York’s subway system last year. Granted, that’s down from around 1.7 billion trips pre-pandemic, but it still far outpaced the next two largest transit systems — DC’s Metro and the Chicago Transit Authority — combined. So when major storms, like last year’s remnants of Hurricane Ida, nor’easters, heavy downpours or swelling tides swamp New York’s low-lying coastal areas and infrastructure, it’s a big deal.

[Image: A subway service notice at 63rd St. and Lexington Avenue in Manhattan after remnants of Hurricane Ida caused serious flooding in New York, New Jersey and Pennsylvania, September 2, 2021. Jonathan Oatis / Reuters]

And it’s a deal that’s only getting bigger thanks to climate change. Sea levels around the city have already risen a foot in the last century, with another 8- to 30-inch increase expected by mid-century and up to 75 additional inches by 2100, according to the New York City Panel on Climate Change. To help city planners, emergency responders and everyday citizens alike better prepare for 100-year storms that increasingly arrive every couple of years, researchers from NYU’s Urban Flooding Group have developed a street-level sensor system that can track rising floodwaters in real time.

The city of New York is set atop a series of low-lying islands and has been subject to the furies of mid-Atlantic hurricanes throughout its history. In 1821, a hurricane reportedly hit directly over the city, flooding streets and wharves with 13-foot swells rising over the course of just one hour; a subsequent Cat 1 storm in 1893 scoured all signs of civilization from Hog Island, and a Cat 3 later passed over Long Island, killing 200 and causing major flooding. Things did not improve with the advent of a storm naming convention. Carol in 1954 also caused citywide floods, Donna in ‘60 brought an 11-foot storm surge with her, and Ida in 2021 dumped an unprecedented amount of rainfall on the region, with the subsequent flooding killing more than 100 people and causing nearly a billion dollars in damages.

[Image: Map of NYC floodplains. NOAA]

As the NYC Planning Department explains, when it comes to setting building codes, zoning and planning, the city works off of FEMA’s Preliminary Flood Insurance Rate Maps (PFIRMs) to calculate an area’s flood risk. PFIRMs cover the areas where “flood waters are expected to rise during a flood event that has a 1 percent annual chance of occurring,” sometimes called the 100-year floodplain. As of 2016, some 52 million square feet of NYC coastline falls within that categorization, impacting 400,000 residents — more than the entire populations of Cleveland, Tampa, or St. Louis. By 2050, that area of effect is expected to double and the probability of 100-year floods occurring could triple, meaning the chances that your home will face significant flooding over the course of a 30-year mortgage would jump from around 26 percent today to nearly 80 percent by mid-century.
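For a sense of where those mortgage-term figures come from, here’s a quick back-of-the-envelope sketch (our own illustration, not the Panel’s math): treat a “100-year flood” as a 1 percent chance in any given year, assume each year is independent, and compute the odds of at least one flood over 30 years. The 26 percent figure falls out directly; an annual chance of roughly 5 percent (our assumption) is what pushes the 30-year risk toward 80 percent.

```python
# Cumulative chance of at least one flood over a mortgage term, assuming
# independent years. The annual probabilities below are illustrative.
def risk_over_term(annual_probability: float, years: int = 30) -> float:
    """Probability of at least one flood event during the term."""
    return 1 - (1 - annual_probability) ** years

print(f"1% annual chance over 30 years: {risk_over_term(0.01):.0%}")  # ~26%
print(f"3% annual chance over 30 years: {risk_over_term(0.03):.0%}")  # ~60%
print(f"5% annual chance over 30 years: {risk_over_term(0.05):.0%}")  # ~79%
```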

[Image: Map of NYC’s 500-year floodplain. NOAA]

As such, responding to today’s floods while preparing for worsening events in the future is a critical task for NYC’s administration, requiring coordination between government agencies and NGOs at the local, state and federal levels. FloodNet, a program launched first by NYU and expanded with help from CUNY, operates on the hyperlocal level to provide a street-by-street look at flooding throughout a given neighborhood. The program began with NYU’s Urban Flooding Group.

“We are essentially designing, building and deploying low cost sensors to measure street level flooding,” Dr. Andrea Silverman, environmental engineer and Associate Professor at NYU’s Department of Civil and Urban Engineering, told Engadget. “The idea is that it can provide badly needed quantitative data. Before FloodNet, there was no quantitative data on street level flooding, so people didn’t really have a full sense of how often certain locations were flooding — the duration of the floods, the depth, rates of onset and drainage, for example.”

[Image: The inner workings of a FloodNet sensor. Urban Flooding Group, NYU]

“And these are all pieces of information that are helpful for infrastructure planning, for one, but also for emergency management,” she continued. “So we do have our data available; they send alerts to folks that are interested, like the National Weather Service and emergency management, to help inform their response.”

FloodNet is currently in early development, with just 23 sensor units erected on 8-foot-tall posts throughout the Gowanus neighborhood in Brooklyn, though the team hopes to expand that network to more than 500 units citywide within the next half decade. Each FloodNet sensor is a self-contained, solar-powered system that uses ultrasound as an invisible rangefinder — as floodwaters rise, the distance between the water’s surface and the sensor shrinks, and the difference between that reading and the dry-street baseline shows how much the water level has risen. The NYU team opted for an ultrasound-based solution rather than, say, lidar or radar, because ultrasound tech is slightly less expensive and provides more focused return data, as well as being more accurate and requiring less maintenance than a basic contact water sensor.
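In code, the core measurement is simple. Here’s a minimal sketch (our illustration, not FloodNet’s actual firmware, which lives on the team’s GitHub) of turning an ultrasonic echo into a flood depth:

```python
# Minimal sketch (our illustration) of how an ultrasonic rangefinder reading
# becomes a flood depth: the sensor hangs on a post above the street, so
# rising water shortens the measured distance.
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 C

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to the surface below the sensor, from an ultrasonic ping's
    round-trip time (half the path, times the speed of sound)."""
    return round_trip_seconds * SPEED_OF_SOUND_M_S / 2

def flood_depth(baseline_to_street_m: float, current_reading_m: float) -> float:
    """Water depth is the dry-street baseline minus the current reading,
    clamped at zero so sensor noise never reports a 'negative flood'."""
    return max(0.0, baseline_to_street_m - current_reading_m)

# Example: a 2.4 m dry-street baseline and a ~2.1 m reading implies roughly
# 0.3 m of standing water.
print(flood_depth(2.4, distance_from_echo(0.01224)))  # round trip ~12.24 ms
```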

The data each sensor produces is transmitted wirelessly using a LoRa transceiver to a gateway hub, which can pull from any sensor within a one-mile radius and push it through the internet to the FloodNet servers. The data is then displayed in real-time on the FloodNet homepage.
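LoRa links carry only a few dozen bytes per message, so readings have to be compact. Here’s a hypothetical sketch (our own, not FloodNet’s actual packet format) of the kind of payload a battery-powered sensor might send to a gateway:

```python
# Hypothetical sensor-to-gateway payload (illustrative only). LoRa frames are
# small, so readings are packed into a few bytes rather than, say, JSON.
import struct
import time

def pack_reading(sensor_id: int, depth_mm: int, battery_mv: int) -> bytes:
    """Pack a reading as: uint16 sensor id, uint32 unix timestamp, uint16 depth
    in millimeters, uint16 battery voltage in millivolts (big-endian, 10 bytes)."""
    return struct.pack(">HIHH", sensor_id, int(time.time()), depth_mm, battery_mv)

payload = pack_reading(sensor_id=23, depth_mm=300, battery_mv=3700)
print(len(payload), payload.hex())  # 10 bytes, easily within a LoRa frame
```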

[Image: FloodNet map of NYC. Urban Flooding Group, NYU]

“The city has invested a lot in predictive models [estimating] where it would flood with a certain amount of rain, or increase in tide,” Silverman said. Sensors won’t have to be installed on every corner to be most effective, she pointed out. There are “certain locations that are more likely to be flood prone because of topology or because of the sewer network or because of proximity to the coast, for example. And so we use those models to try to get a sense of locations where it may be most flood-prone.” The team also reaches out to local residents with first-hand knowledge of likely flood areas.

In order to further roll out the program, the sensors will need to undergo a slight redesign, Silverman noted. “The next version of the sensor, we’re taking what we’ve learned from our current version and making it a bit more manufacturable,” she said. “We’re in the process of testing that and then we’re hoping to start our first manufacturing round, and that’s what’s going to allow us to expand out.”

FloodNet is an open-source venture, so all of the sensor schematics, firmware, maintenance guides and data are freely available on the team’s GitHub page. “Obviously you need to have some sort of technical know-how to be able to build them — it may not be right now where just anyone could go build a sensor, deploy it and be online immediately, in terms of being able to just generate the data, but we’re trying to get there,” Silverman conceded. “Eventually we’d love to get to a place where we can have the designs written up in a way that anyone can approach it.”

NASA successfully smacked its DART spacecraft into an asteroid

After nearly a year in transit, NASA’s experimental Double Asteroid Redirection Test (DART) mission, which sought to answer the questions “Could you potentially shove an asteroid off its planet-killing trajectory by hitting it with a specially designed satellite? How about several?”, has successfully collided with the asteroid Dimorphos. Results and data from the collision are still coming in, but NASA ground control confirms that the DART impact vehicle intercepted its target. Yes, granted, Dimorphos is roughly the size of an American football stadium, but space is both very large and very dark, and both asteroid and spacecraft were moving quite fast at the time.

[Image: DART strike. NASA]

“It’s been a successful completion of the first part of the world’s first planetary defense test,” NASA Administrator Bill Nelson said after the impact. “I believe it’s going to teach us how one day to protect our own planet from an incoming asteroid. We are showing that planetary defense is a global endeavor and it is very possible to save our planet.”

NASA launched the DART mission in November 2021 in an effort to explore the use of defensive satellites as a means of planetary defense against near-Earth objects. The vending machine-sized DART impactor vehicle was traveling at roughly 14,000 MPH when it fatally crossed Dimorphos’ path roughly 6.8 million miles from Earth.

Whether future iterations of a planetary defense system will be brimming with satellites willing to go all June Bug vs. Chrysler Windshield against true planet-killer asteroids remains to be seen. Dimorphos itself is the smaller of a pair of gravitationally entangled asteroids — its parent rock is more than five times as large — but both are dwarfed by the space rock that hit Earth 66 million years ago, wiping out 75 percent of multicellular life on the planet while gouging out the Gulf of Mexico.

[Image: The moment of DART’s death. NASA]

The DART team will likely spend days poring over the data generated by both the impactor and the cameras released before the spacecraft made its final approach. The team will consider shortening Dimorphos’ orbital track around Didymos by 10 minutes an ideal outcome, though any change of at least 73 seconds will still be hailed as a rousing success. The team will have to observe Dimorphos’ orbit for at least half a day to confirm that success, as the moonlet needs nearly 12 hours to complete a circuit around Didymos.
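To put those thresholds in perspective, here’s a quick bit of arithmetic (our own illustration, not NASA’s analysis) comparing them against Dimorphos’ roughly 11.9-hour orbit around Didymos:

```python
# How the success thresholds compare to one pre-impact orbit of Dimorphos.
ORBITAL_PERIOD_S = 11.9 * 3600  # ~11 hours 55 minutes, in seconds

for label, change_s in [("minimum success (73 s)", 73), ("ideal outcome (10 min)", 600)]:
    print(f"{label}: {change_s / ORBITAL_PERIOD_S:.2%} of one orbit")
# minimum success (73 s): 0.17% of one orbit
# ideal outcome (10 min): 1.40% of one orbit
```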

Update 9/27/2022 2:29 AM ET: The NASA-funded ATLAS (Asteroid Terrestrial-impact Last Alert System) managed to record video of the impact (above). While fuzzy, it’s still pretty cool.

Hitting the Books: How Southeast Asia’s largest bank uses AI to fight financial fraud

Yes, robots are coming to take our jobs. That’s a good thing; we should be happy they are, because the jobs they’re taking kinda suck. Do you really want to go back to the days of manually monitoring, flagging and investigating the world’s daily bank …

Tesla to recall more than a million vehicles over pinchy windows

More than a million Tesla owners will have yet another recall notice to deal with in the coming weeks. On Tuesday the National Highway Traffic Safety Administration filed a safety recall notice for numerous late-model vehicles from across the EV maker’s lineup because “the window automatic reversal system may not react correctly after detecting an obstruction,” and as such, “a closing window may exert excessive force by pinching a driver or passenger before retracting, increasing the risk of injury,” per the notice.

The following models and years are impacted: 2017-22 Model 3s as well as 2020-21 Model Y, X and S vehicles. Tesla has until mid-November to contact affected owners and plans to push an OTA software update to correct the issue. 

Per the Associated Press, Tesla first identified the issues during product testing in August and has incorporated the update into newly built vehicles since September 13th. However, multiple Twitter users have sounded off in response to Tuesday’s announcement, noting that their vehicles have been having nearly identical issues since at least 2021. 

This is far from Tesla’s first safety recall. Over the last two years alone, Teslas have been recalled on account of overheating infotainment systems, camera and trunk defects, separating front suspensions, their “full self driving” ADAS, their pedestrian warning sounds, their seatbelt chimes, software glitches in their brakes, and sundry touchscreen failures. And that’s just in the US. In Germany this past July, Tesla got popped trying to pass off painted-over frame damage on its Model 3s too.

Hertz to purchase 175,000 General Motors EVs over the next five years

Hertz is once again growing its EV fleet, announcing Tuesday that it has struck a deal with General Motors to purchase 175,000 electric vehicles from the automaker’s Chevrolet, Buick, GMC, Cadillac and BrightDrop brands over the next five years. Customers will see the first offerings, namely the Chevrolet Bolt EV and Bolt EUV, arrive on Hertz lots beginning in the first quarter of next year.

The deal, which runs through 2027, will bring a wide variety of models to Hertz’s growing EV herd. Between now and 2027, the rental company expects its customers to drive about 8 billion miles in said EVs, preventing an estimated 3.5 million metric tons of carbon dioxide from being released. Hertz plans to convert a quarter of its rental fleet to battery electric by 2024. 

This news follows Hertz’s 65,000-vehicle order from Polestar in April, which the performance EV maker has already begun delivering. An earlier announcement in 2021 had many believing that Tesla would be supplying the Hertz fleet with 100,000 vehicles worth an estimated $4.2 billion, but that notion was quickly kiboshed by Tesla CEO Elon Musk. Hertz is already planning to rent 50,000 Tesla EVs to Uber drivers, a program that now operates in 25 North American cities; there’s no word on whether GM’s vehicles will be offered under similar terms.

If you’ve already ordered a GM EV and are waiting on delivery, don’t fret: this deal with Hertz shouldn’t impact your existing delivery date. “Our first priority is delivering vehicles to customers holding reservations,” a GM rep told Engadget via email Tuesday. “GM is installing capacity to meet demand from all customers, with annual capacity in North America rising to more than one million units in 2025.”