If you haven’t heard of Nvidia Edify and its 3D modeling capabilities, check it out now.
One of the most valuable companies on the U.S. stock market, Nvidia is best known for its hardware and data center solutions, but its engineers have also built some interesting technologies for consumers.
One of them is the ability to generate 3D assets from 2D images or text prompts.
With Edify, you can bring anything to life, including vehicles, cartoon characters, and comfy couches.
The platform is also used in games, virtual reality and augmented reality projects, and even various retail applications.
The company specifically worked with Shutterstock and used some of its data to develop the model.
Under the hood
Nvidia Edify uses what’s called a “quad mesh” model, where 3D objects are made up of quadrilaterals or four-sided polygons. This allows for natural edge flow of objects, versatility for subdivision modeling, and rigging for animation.
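To make the quad-mesh idea concrete, here is a minimal sketch of how such a mesh can be represented in code. This is an illustrative data structure only, not Edify's internal format: the class and method names are my own.

```python
# Minimal quad-mesh sketch (illustrative, not Nvidia's internal format):
# every face is a quadrilateral, stored as four vertex indices.

class QuadMesh:
    def __init__(self):
        self.vertices = []   # list of (x, y, z) tuples
        self.faces = []      # list of (i0, i1, i2, i3) index tuples

    def add_vertex(self, x, y, z):
        self.vertices.append((x, y, z))
        return len(self.vertices) - 1

    def add_quad(self, i0, i1, i2, i3):
        self.faces.append((i0, i1, i2, i3))

# A unit square built as a single quad face:
mesh = QuadMesh()
corners = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
ids = [mesh.add_vertex(*c) for c in corners]
mesh.add_quad(*ids)
print(len(mesh.vertices), len(mesh.faces))  # 4 1
```

Because every face has exactly four sides, subdivision (splitting each quad into four smaller quads) stays regular, which is why quad meshes are preferred for subdivision modeling and animation rigging.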
This process also synthesizes RGB values and surface contours using a diffusion model. Another component is built using "physically-based rendering materials," which refers to how 3D surfaces are defined so they respond realistically to lighting conditions.
These materials simulate the effects of natural light on 3D objects using attributes such as base color, roughness, and normal maps.
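As a rough sketch of what those material attributes look like in practice, here is a tiny PBR-style material with a simplified Lambert shading term. The field names and the shading function are my own illustrative assumptions, not Edify's actual schema.

```python
# Hedged sketch of physically-based-rendering material attributes
# (hypothetical field names; not Edify's actual material schema).
from dataclasses import dataclass

@dataclass
class PBRMaterial:
    base_color: tuple      # albedo RGB, each channel in [0, 1]
    roughness: float       # 0 = mirror-smooth, 1 = fully diffuse
    metallic: float = 0.0
    normal_map: str = ""   # path to a tangent-space normal texture

def diffuse_shade(mat, n_dot_l):
    """Very simplified Lambert term: base color scaled by light angle."""
    lit = max(0.0, n_dot_l)
    return tuple(c * lit for c in mat.base_color)

couch = PBRMaterial(base_color=(0.6, 0.4, 0.3), roughness=0.9)
print(diffuse_shade(couch, 0.5))  # half-lit surface
```

Real PBR pipelines add specular, metallic, and ambient-occlusion terms on top of this, but the core idea is the same: a handful of numeric attributes that describe how a surface reacts to light.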
Other products from Nvidia
Experts point out that Nvidia Edify will be deployed through something called Nvidia NIM or Nvidia Inference Microservices. NIM is officially part of the Nvidia AI Enterprise Platform and has a variety of API tools for working with container orchestration and Kubernetes clusters.
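For a sense of what calling a model served as a microservice might look like, here is a hedged client-side sketch. The endpoint URL, route, and JSON fields are illustrative assumptions, not Nvidia's documented NIM API.

```python
# Hedged sketch of a client request to a hypothetical NIM-hosted Edify
# service. The host, route ("/v1/infer"), and payload fields below are
# assumptions for illustration, not Nvidia's documented API.
import json
import urllib.request

def build_nim_request(base_url, prompt):
    payload = {"prompt": prompt, "format": "obj"}   # hypothetical fields
    return urllib.request.Request(
        url=f"{base_url}/v1/infer",                 # hypothetical route
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Build (but don't send) a request to a service inside a Kubernetes cluster:
req = build_nim_request("http://edify-nim.cluster.local:8000", "a comfy couch")
print(req.full_url, req.get_method())
```

The point of the microservice packaging is exactly this: from the client's perspective, a 3D generation model becomes an ordinary HTTP endpoint that Kubernetes can scale and route like any other container.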
In addition to NIM, Nvidia AI Enterprise has GPU acceleration libraries, something called the NGC catalog with tools for deployment, and a “base command manager” that helps handle workload management and container operations.
Enterprise clients can also take advantage of Nvidia network operators, AI workflow tools, and various infrastructure components.
Who is using Edify?
Going back to the consumer side, people are increasingly using Edify to digitally assemble 3D objects, and brands have taken notice.
In addition to independent gamers and developers of all kinds, businesses are also putting the platform to work.
Some basic research into what is already out on the web shows that brand-specific integrations are in place.
“Some big-name companies are already using Getty’s AI services to explore new creative horizons in marketing,” writes Ivanna Atti of AI Secrets. “Marketing company WPP and Coca-Cola are working together to create custom visuals that meet brand style and guidelines using Getty Images Generative AI.”
Atti also mentioned Adobe, which is rumored to be using Edify 3D in Adobe Firefly.
There’s more information on the Nvidia blog, and a spokesperson says Mattel also uses Edify to help toy designers visualize their products.
And this is from the Accenture newsroom: "Accenture collaborated with NVIDIA and Jaguar Land Rover (JLR) to enhance the client experience for the Defender vehicle line through advanced generative AI and real-time graphics. Accenture Song leveraged NVIDIA's Omniverse platform for this effort, creating a high-fidelity digital twin of the Defender vehicle from computer-aided design data."
For context, the Defender is an off-road model made by the British automaker Jaguar Land Rover. Its lineage dates back to 1948, and the Defender name was officially adopted in 1990.
Marketing companies are now building high-resolution digital twins to show off these vehicles to the world.
This gives a sense of what people are already doing with this brand-new technology. Early generative AI delivered robust diffusion and image-creation tools; now we have 3D capabilities and more firepower to build the metaverse and VR/AR environments of the future.