“Acoustic holograms” quickly assemble objects from particles or cells

Scientists have created “acoustic holograms” that can assemble matter into 3D objects using only sound. The technique works with different types of particles and even living cells, enabling a new method of fast, non-contact 3D printing.

Sound exists as pressure waves moving through a medium such as air or water. These waves can exert pressure on the surfaces they hit, although this force is so weak that we usually only notice it on our eardrums. However, scientists have experimented with manipulating high-frequency ultrasound to levitate small objects, create complex soundscapes, or add a sense of touch to visible holograms.

For the new study, scientists from the Max Planck Institute and the University of Heidelberg investigated a new use for ultrasound: moving tiny building blocks in precise ways to assemble 3D objects. They used specially designed 3D-printed plates to create a specific sound field. By combining several of these plates with different designs, an acoustic hologram can be created in a specific 3D shape.

It works like an invisible mold – when this ultrasonic hologram is applied to particles suspended in liquid, pressure is exerted with different intensities in different regions until the particles coalesce into the precise 3D shape desired. In tests, the team was able to create shapes such as a dove, a figure-8, and a spiral from materials including glass beads, hydrogel, and even biological cells.
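The "invisible mold" idea can be sketched numerically. Below is a minimal, illustrative Python model – not the researchers' code, and all numbers (frequency, plate size, focal length) and the phase profile are assumptions for illustration. Each point on a plate re-radiates the incoming ultrasound with a local phase delay set by the plate's thickness, and the pressure anywhere in the liquid is the sum of those delayed waves; regions where they interfere constructively are where the field exerts its strongest forces on suspended particles.

```python
import numpy as np

# Illustrative sketch only (not the study's code): a phase plate modeled as a
# grid of point sources, each delaying the incoming ultrasound by a local phase.
# The pressure at any point in the liquid is the Huygens-style sum of the waves
# from every plate pixel. All numeric values below are assumed for illustration.

f = 2.0e6                    # ultrasound frequency, Hz (assumed)
c = 1500.0                   # speed of sound in water, m/s
k = 2 * np.pi * f / c        # wavenumber

# A 64 x 64 "plate", 20 mm across, carved with a simple lens-like phase profile
n = 64
xs = np.linspace(-0.01, 0.01, n)
px, py = np.meshgrid(xs, xs)
focal = 0.03                                      # 30 mm focal distance (assumed)
plate_phase = -k * (px**2 + py**2) / (2 * focal)  # converging-lens delay profile

def pressure_amplitude(x, y, z):
    """|pressure| at point (x, y, z), with z the distance from the plate."""
    r = np.sqrt((x - px)**2 + (y - py)**2 + z**2)  # pixel-to-point distances
    return abs(np.sum(np.exp(1j * (k * r + plate_phase)) / r))

# The lens profile concentrates acoustic energy at the on-axis focus, where the
# field is strong, and leaves it comparatively weak a few millimetres aside.
on_axis = pressure_amplitude(0.0, 0.0, focal)
off_axis = pressure_amplitude(0.005, 0.0, focal)
print(on_axis > off_axis)
```

A single lens-like profile only makes one focal spot; the study's trick is combining several plates with far more intricate, computer-optimized phase maps so that the strong- and weak-pressure regions together trace out an arbitrary 3D target shape.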

The technique has a few potential advantages. It can be faster and more efficient because it works in a single step, unlike traditional 3D printing, which builds an object layer by layer. And because the particles don’t need to be physically touched, the process is gentler on biological cells, which could make it well suited to fabricating tissues and organs.

“This can be very useful for bioprinting,” says study author Peer Fischer. “The cells used there are particularly sensitive to the environment during the process.”

The team says future work could explore ways to improve the technique, including using more hologram plates, higher ultrasonic frequencies and other materials.

The research was published in the journal Science Advances.

Source: Max Planck Institute

Source: newatlas.com

The iPhone 15 would come with a new camera bump

The official unveiling of the iPhone 15 should be about seven months away, but in the meantime there are plenty of leaks and rumors to digest, including one hinting at a “new camera bump” for the iPhone 15 and the iPhone 15 Plus.

This comes from the tipster ShrimpApplePro (via GSMArena) on Twitter. The claim was made in response to a leaked spec summary of the iPhone 15 and iPhone 15 Plus that had been circulating up to this point.

No other details are included, though – so it’s unclear exactly how the camera bump on the successors to the iPhone 14 and iPhone 14 Plus will differ. Presumably, some sort of overhaul will be involved.

Take it with a pinch of salt

It’s worth noting that the mention of a new camera bump is preceded by the line “Don’t quote me on that, but” – suggesting that this might not be the most reliable rumor, either in terms of the quality of the information or the certainty that it will come to pass.

However, ShrimpApplePro is no newcomer to the phone leak game. Although we have received inaccurate information from this source in the past, there have also been correct predictions – one of which was the battery size of the iPhone 14 Pro Max.

So, as of this writing, the redesigned camera for the iPhone 15 and iPhone 15 Plus is best filed under “maybe”. It will be interesting to see whether we hear any more rumors along these lines before September.

Analysis: let the speculation begin

In terms of appearance, the camera bump on the back of the iPhone 14 is identical to that of the iPhone 13, which itself deviated only slightly (in the positioning of the lenses) from what we got on the iPhone 12 and iPhone 11.

So it’s safe to say that we haven’t seen a major camera overhaul on the iPhone since 2019, which is quite a long stretch. That makes it easier to believe that this is an accurate leak and that Apple really is going to change things up this year.

There’s already been speculation on Twitter about what the “new camera bump” reference might mean. Maybe additional lenses will be added beyond the two 12MP modules on the current model, or maybe the bump will get bigger or more pronounced.

Apple could even follow the lead of the Samsung Galaxy S23 and separate the camera lenses on the back of the phone with no visible casing around them. So far, we haven’t seen any leaked images of the iPhone 15.

Source: www.techradar.com


NASA’s Lunar Flashlight satellite will not reach its planned orbit

This week there was good news for one NASA lunar mission as the CAPSTONE satellite recovered from a communications problem, but bad news for another. The Lunar Flashlight mission, designed to search for water ice at the moon’s south pole, will no longer be able to reach the planned orbit.

This illustration shows NASA’s Lunar Flashlight performing a course correction maneuver with the Moon and Earth in the background. NASA/JPL-Caltech

Lunar Flashlight, a small satellite known as a CubeSat, was launched last December but quickly ran into trouble on its journey. Three of its four thrusters malfunctioned, making it difficult for the satellite to perform the maneuvers necessary to enter its intended lunar orbit.

NASA explained in an update that the team at NASA’s Jet Propulsion Laboratory and Georgia Tech attempted to work around the problem by spinning the spacecraft and firing its one working thruster in 10-minute bursts that they hoped would nudge the spacecraft in the required direction. But after several attempts, that thruster also stopped working.

The spacecraft will almost certainly not reach its intended near-rectilinear halo orbit now. However, all is not lost. The team is working on a plan to salvage as much of the mission as possible by placing the satellite in a high Earth orbit, which would allow it to make periodic flybys of the Moon and give it the opportunity to collect data from the Moon’s south pole.

The satellite has limited fuel left after the attempts to reach its original orbit, but the team will try to begin maneuvers this week that could allow it to make its first science pass over the Moon in June.

NASA was philosophical in announcing the issue, pointing out that Lunar Flashlight was a technology demonstration with a new, miniaturized propulsion system — meaning it was essentially a test of a new concept. “Technology demonstrations are high-risk, high-reward ventures aimed at pushing the frontiers of space technology,” the agency wrote in the announcement. “Lessons learned from these challenges will help inform future missions that continue to advance this technology.”

Source: www.digitaltrends.com

#NASAs #Lunar #Flashlight #satellite #reach #planned #orbit

A quantum explanation of gravity could give us the theory of everything

  • Physicists say understanding gravity requires a quantum mechanical explanation.
  • However, there is no direct evidence for hypothetical quantum gravity particles called gravitons.
  • Experimenters hope to discover the effects of gravitons within ten years.

To our knowledge, our physical world is governed by four fundamental forces: electromagnetism, the weak and strong nuclear forces, and gravity. Aside from playing with bar magnets or marveling at the light of a rainbow, gravity is the force we know best here on Earth. Yet it’s actually the least understood force of the group.

Our understanding of gravity has undergone a number of renovations over the past hundred years – from Newton’s interpretation of the motions of planets and apples to Einstein’s theory of general relativity and spacetime. But for physicists like Kathryn Zurek, a professor of theoretical physics at Caltech whose work focuses on dark matter as well as observational signals of quantum gravity, that’s still not enough.

She’s not the only one. Theorists and experimentalists around the world have been working for decades to write a so-called “theory of everything” that would unify quantum explanations of the very small with classical physics of the very large (like humans and planets). A testable theory of quantum gravity is at the heart of this quest for a single theory that explains everything in our universe.

“For many reasons, we believe that the fundamental understanding of gravity should be quantum mechanical in nature,” Zurek tells Popular Mechanics. “So we have to figure out how to make these basic principles of quantum mechanics [work for] gravity. That’s quantum gravity – classical gravity combined with quantum mechanics.”

Zurek is part of a joint Caltech and Fermilab team that is currently developing a new kind of experiment called Gravity from Quantum Entanglement of Space-Time (GQuEST), which will look for signatures of quantum gravity through their observable effects on photons.

What is Quantum Gravity?

Scientists are pretty confident that a quantum explanation of gravity should exist, but finding a theory to support that belief — let alone prove it to be correct — has been much more difficult, Zurek says.

In the Standard Model of particle physics, a model that explains all fundamental forces except gravity, the forces are carried by specialized particles. For example, the electromagnetic force is transmitted by photons, which can be felt as light. Following this logic, physicists suggested that gravity should also have its own particle, which physicists dubbed the “graviton.” However, trying to fit a graviton into the picture using existing math has led scientists into a tangle of impossible math, such as equations that end in infinities.

Physicists are considering a number of theories to solve this problem, but Zurek says string theory remains the best description so far.

Physicists originally proposed string theory in the late 1960s, and it can take on many different flavors. The general idea is that the universe is made up of ten dimensions (or sometimes more) – only four of which make up space and time as we know them. The remaining dimensions are a kind of invisible frame. In this multidimensional model, very small objects called “strings” replace the particles. These strings, like plucked guitar strings, vibrate at different frequencies corresponding to different fundamental particles. Scientists suggest that one such frequency should correspond to the theoretical graviton.

One of the most startling conclusions we can draw from string theory is that gravity may not even be strictly “real.”

In other words, gravity – and even spacetime – may simply be emergent properties produced by the quantum entanglement of particles. Netta Engelhardt, a theoretical physicist at the Massachusetts Institute of Technology, told Space.com that this phenomenon is similar to the sensation of heat, which is in fact just our body’s experience of the speed of the air molecules that surround us.

All this is purely theoretical for the moment. Although string theory has proven itself in many ways, including providing a unified and elegant description of gravity, there are still many questions it doesn’t answer, Zurek says. For example, string theory cannot yet incorporate our existing understanding of the Standard Model.

“It is believed that if we understand string theory well enough, we will understand how to integrate Standard Model matter into this theoretical structure of quantum gravity, but it is unclear how to do this [yet].”

Finding physical evidence for quantum gravity

Zurek’s work does not attempt to confirm or refute string theory, but it is looking for ways to bring the quest for quantum gravity into the physical world. The basic design of GQuEST is a tabletop version of the Laser Interferometer Gravitational-Wave Observatory (LIGO) gravitational wave detector.

With incredibly precise measurements, the researchers look for small fluctuations in the path of photons as they pass between mirrors. These disturbances could be the effect of gravitons. The researchers hope to observe such effects within the next five to ten years.

“We believe that with this type of measurement, we may be able to see the quantum nature of gravity in this type of experiment for the first time,” Zurek says. “From this perspective, it will be a big step forward in our understanding of how quantum mechanics and gravity come together.”

Sarah is a Boston-based science and technology writer interested in how innovation and research intersect with our daily lives. She has written for a number of national publications and covers innovation news at Inverse.

Source: www.popularmechanics.com

SpaceX blocks Ukraine from using Starlink service to control drones

What just happened? SpaceX’s close relationship with Ukraine could be strained after the company restricted the country’s ability to use the Starlink satellite service for offensive military purposes. The move follows reports that Ukraine used Starlink to control drones.

SpaceX has shipped more than 25,000 Starlink terminals to Ukraine and serviced them since the war began, helping to keep the country’s critical infrastructure and its citizens online as Russia continues its assault.

But Ukraine reportedly used Starlink in its offensive push against the Russian military, including to target enemies with drones, in violation of SpaceX policies.

Gwynne Shotwell, president and chief operating officer of SpaceX, told a conference in Washington, DC on Wednesday (via Reuters) that Starlink should never be used as a weapon.

“However, the Ukrainians used it in ways that were unintended and not part of any agreement,” she said, referring to reports that Starlink was used to control Ukrainian drones. “There are things we can do to limit their ability to do that [control the drones],” she said. “There are things we can do, and have done.”

Shotwell never disclosed what steps SpaceX took to prevent Ukraine from using Starlink for military attacks. She pointed out that the service could be used for military communications but was never intended for offensive purposes.

In October, SpaceX CEO Elon Musk said the company could not fund Starlink in Ukraine indefinitely, despite other governments sharing the cost of equipment and maintenance. The company estimated costs could reach nearly $380 million over the next 12 months and wanted the US government to pay for the additional terminals and ongoing service costs.

It was only a few days later that Musk backed down and promised that SpaceX would continue to fund Ukraine “indefinitely,” although the billionaire recently tweeted that any course of action toward Ukraine would draw criticism: “Damned if you do, damned if you don’t,” he wrote.

“SpaceX Starlink has become the backbone of Ukraine’s connectivity up to the front lines. That’s the damned-if-you-do part,” Musk wrote in a separate tweet. “However, we don’t allow Starlink to be used for long-range drone strikes. That’s the damned-if-you-don’t part.”

Russia warned last year that SpaceX satellites could become a “legitimate target”. The country attempted to jam Starlink signals in Ukraine, which Musk says led the company to improve the security of the service’s software.

Musk was previously embroiled in a war of words with the former head of Russia’s Roscosmos space agency, which prompted the Tesla boss to joke in a tweet about dying under mysterious circumstances.

Source: www.techspot.com


With REVOPOINT’s new affordable handheld scanner, you can easily convert large objects into precise 3D models

The Revopoint RANGE uses the company’s new infrared structured-light projector, which captures a large area of 360 mm x 650 mm (at a distance of 600 mm), allowing you to scan objects the size of an entire car in one piece in just a few minutes. Functioning as both a stationary and a handheld scanning device, the RANGE records both 3D and color data to produce an accurate model with a precision of up to 0.1 mm, along with precise color information. The models you scan can then be used in the metaverse, for reverse engineering and modeling, or even for 3D printing, covering a variety of applications across a variety of industries.

Designer: Revopoint Design

Click here to buy now: $474 (regularly $729, a $255 discount). Hurry, more than 184/700 claimed! Over $1,300,000 raised.

The advantage of the Revopoint RANGE lies entirely in its name. The device has a scanning area of 360 mm x 650 mm (that’s almost a 1.2 ft x 2 ft bounding box), so you can cover large areas in less time. The RANGE also uses a state-of-the-art dual-camera system with a projector that emits invisible infrared light, enabling scans with even higher precision than scanners that use lasers or blue light. Because the light projector emits invisible light, the RANGE is also perfect for scanning people and animals. Hold the Revopoint RANGE anywhere from 300 mm (12″) to 800 mm (31″) from your subject, and its ability to run at 12–18 FPS means you can scan an entire human in less than 2 minutes, or a car interior in minutes, complete with color information and accurate to within 0.1 mm. With the RANGE you can also scan transparent and reflective objects using a scanning spray that creates an opaque film on the object and then disappears completely within a few hours.

The Future of Interior Design – Deliver stunning interior design customizations by scanning furniture, fixtures, or even entire rooms and creating colorful, vibrant 3D models for use in interior design software like DreamPlan or MagicPlan.

Automatic modeling made easy – You can quickly capture and measure precise dimensions even when digitizing complex geometric surfaces.

Capture History – Its portable scanning mode, light weight, and compatibility with iOS and Android devices make it a portable and versatile tool for capturing great works of art without having to move or touch them.

Head to Toe in Under 2 Minutes – Simplify the creation of 3D human models for video games, AR and VR applications with a full body and head scan in under two minutes.

Accurate 3D models – With an image accuracy of up to 0.1 mm and a point spacing of up to 0.3 mm, the RANGE’s new dual infrared cameras with aspherical lenses reduce image errors and ensure that the microstructured infrared light reaches the sensors evenly.

It’s a colorful world – The RANGE’s RGB camera captures the color of an object as it is scanned and, after data meshing, the color can be merged with the 3D model to create precise, full-color, near-lifelike 3D models.

These scans can then be used in a range of industries. High-precision models are ideal for reverse engineering, modification, and upgrading. Transportation designers can use these models to run simulations, improve designs, and perform wind-tunnel testing, while garages can use them to create custom-fit parts and fairings. Interior designers could create a full 3D catalog of furniture and decorations, which can then be placed in virtual rooms to see how they look. Models can be modified, changing colors, textures, and materials. The color information captured by the scanner also proves invaluable to animators, who can directly scan and manipulate people without spending hours and days painting layers of skin, hair, eyes, clothing, etc. Output models can even be 3D printed, either in single-material filament or in full color!

Scans created with the RANGE scanner

The Revopoint RANGE is designed to be universal, versatile, yet affordable. It can be used indoors or outdoors, and in handheld or free-standing formats. Using a stabilizer with the RANGE provides better control and faster scanning, while Revopoint also offers large turntables for heavy objects weighing up to 200 kg (441 lb). The RANGE starts at just $328 (thanks to a 55% discount for super-early-bird backers) and rivals their popular POP 2 and MINI models, which had an average starting price of $799. The Revopoint RANGE is compatible with Windows 8/10/11 (64-bit), Android, iOS, and Mac devices and only requires a USB cable for power and data transfer.

Click here to buy now: $474 (regularly $729, a $255 discount). Hurry, more than 184/700 claimed! Over $1,300,000 raised.

Source: www.yankodesign.com

Former Twitter executives tell House committee deleting story from Hunter Biden’s laptop was a ‘mistake’

Former Twitter executives told a House committee on Wednesday that the social media company erred in addressing a controversial New York Post article about Hunter Biden’s laptop.

The social media platform’s action just weeks before the 2020 election sparked a flurry of backlash from Republicans, who accused Twitter executives of suppressing the story to protect Joe Biden and his family from what they believed was damaging material on a laptop hard drive belonging to the president’s son.

At a House Oversight and Accountability Committee hearing, Republicans questioned the three former leaders about the company’s decision to block users from sharing the story about the younger Biden and suggested that the social media giant acted on government orders when it suppressed the story.

“America has witnessed a coordinated campaign by social media companies, mainstream media, and the intelligence community to suppress and delegitimize the existence of Hunter Biden’s laptop and its contents,” said James Comer, R-Ky., chairman of the Oversight Committee, in his opening remarks.

Former Twitter employees called the platform’s decision on the story a “mistake”, but denied acting in concert with government officials.

“It was clear to me that, in my judgment at the time, Twitter should not have taken any action to block the New York Post’s reporting,” said Yoel Roth, Twitter’s former head of trust and safety. He said the company made the decision because the Biden laptop story was reminiscent of the 2016 Russian hack of the Democratic National Committee.

Vijaya Gadde, Twitter’s former chief legal officer, echoed Roth, saying that Twitter admitted “its initial action was wrong” and changed its policy within 24 hours.

Vijaya Gadde, former chief legal officer of Twitter, speaks during a House Oversight and Accountability Committee hearing on Twitter’s handling of a 2020 New York Post article about Hunter Biden and his laptop, February 8, 2023, in Washington, D.C.

Evelyn Hockstein/Reuters

“The New York Post chose not to remove its original tweets, so after two weeks Twitter made an exception to retroactively apply the new policy to Post tweets,” Gadde said. “In hindsight, Twitter should have reactivated the Post account immediately.”

During the hearing, Roth also said that Twitter’s relationship with government employees would benefit from greater transparency.

“Transparency is at the heart of this work, and that’s where I think Twitter — and all social media — can and should improve,” Roth said. “Trust is built on understanding, and right now the vast majority of people don’t understand how or why content moderation decisions are made.”

Republicans accused the former Twitter executives of fearing that Joe Biden would not win the 2020 election and of working with the FBI.

“You were given supreme power over Twitter, but when you were confronted with the New York Post story, you rushed to come up with a reason why the American people shouldn’t see it, rather than allowing people to read the information and judge it for themselves,” Comer said. “Within hours, you decided the truth about a story that spans years and dozens of complex international transactions. You did this because you feared Joe Biden would lose the 2020 election.”

Rep. Jim Jordan, R-Ohio, accused Twitter executives of holding “weekly meetings” with the FBI and of colluding with the agency to remove the New York Post article.

“I think you wanted it taken down,” Jordan said. “They sent you all kinds of emails… I think you wanted to take it down. I think the FBI played you.”

Rep. Lauren Boebert, R-Colo., who was briefly suspended from Twitter in 2021 for tweeting the false claim that the 2020 presidential election was stolen, also accused former Twitter employees of “collaborating with the FBI”.

“I am angry at the millions of Americans who have been silenced because of your decisions, because of your actions, because of your collusion with the federal government,” Boebert said.

“We don’t know where the FBI ends and where Twitter begins,” Boebert said.

But Roth denied the allegations, telling the committee that the FBI had not told Twitter that the laptop’s hard drive was fake or hacked.

Former Twitter deputy general counsel James Baker, who was fired by new Twitter CEO Elon Musk in December, also said he was not in contact with the FBI about the company’s decision to block the article.

At one point during the hearing, Rep. Gerry Connolly, D-Va., questioned former Twitter employee Anika Collier Navaroli about a Trump White House request for Twitter to remove a tweet from celebrity Chrissy Teigen that had offended then-President Donald Trump.

“The White House contacted Twitter almost immediately afterwards to request that the tweet be removed. Is that correct?” Connolly asked Navaroli.

“I remember hearing that we got a request from the White House to make sure we evaluated this tweet, and they wanted it taken down because it was a derogatory statement toward the president,” Navaroli replied.

Hunter Biden walks toward a vehicle after disembarking from Air Force One with his father, President Joe Biden, at Hancock Field Air National Guard Base in Syracuse, New York, Feb. 4, 2023.

Elizabeth Frantz/Reuters

Twitter did not remove the tweet, she said.

During the hearing, Rep. Jamie Raskin, D-Md., said Republicans were focused on a “two-year-old story” about a private company being allowed to make decisions about what content to allow on its platform.

“The key point here is that it was Twitter’s decision,” Raskin said in his opening statement. “Twitter is a private media company. In America, private media companies can decide what gets published.”

“Rather than dropping this pointless pursuit, my colleagues have attempted to foment a fake scandal over this two-day lapse in their ability to spread Hunter Biden propaganda on a private media platform,” Raskin said of the hearing. “Silly doesn’t even begin to capture this obsession.”

Democrats instead focused on how the social media platform may have helped fuel violence in the US Capitol on January 6, 2021.

“What makes this hearing tragic is that if our colleagues really wanted to look at a serious issue affecting American democracy and social media, my friends, that’s pretty obvious to us,” Raskin said.

Navaroli also argued that lawmakers should focus on “Twitter’s inability to act before Jan. 6.”

Navaroli said Twitter executives “gave in and broke their own rules to protect some of the most dangerous speech on the platform” in the months leading up to the Jan. 6 riot at the Capitol.

Source: abcnews.go.com

Google Bard explained what this AI-powered ChatGPT competitor can do

There’s been a lot of talk about artificial intelligence lately, especially after OpenAI unveiled its revolutionary ChatGPT service, which Microsoft is now looking to integrate with Office 365 and Bing Search. Meanwhile, Google has been a self-proclaimed “AI-first” company since announcing a shift in focus at I/O 2017, and it recently unveiled plans for an AI-powered Google Search feature called Bard. But in a sea of buzzwords and acronyms, it can be hard to figure out what these new tools actually do.

Google Search already uses AI to understand slang and to power tools like Google Lens and Google Assistant, so you might be wondering how Bard is different. The key is Bard’s conversational skill and ability to answer questions – but there’s a lot more to it than that, so let’s dive in.

What is the bard and where does he come from?

Simply put, Bard is generative AI – the generic name for AI models like ChatGPT and DALL-E that can create new content. Generative AIs can create video, audio, and images, but Bard focuses on generating text, especially text that answers your questions in a natural and conversational way.

Bard takes its name from the word meaning “poet” – as in the Bard of Avon, William Shakespeare – in reference to its linguistic ability.

Considering the timing, Bard might seem like a product rushed out the door to compete with ChatGPT. But interestingly, Google actually laid the groundwork for ChatGPT when it released its Transformer deep learning model to the public in 2017, and Bard’s main backend, LaMDA, was announced nearly two years ago. So OpenAI’s new tool shares a lineage with Google, and Bard itself has been in development for years.

Bard is based on LaMDA, a conversational AI model introduced by Google in 2021

How does Bard work?

Google wants Bard to complement the Knowledge Graph cards you see in Search when you perform queries that have a simple answer. While a Knowledge Graph card might give you the definition of a word or an overview of a person or place, Bard tries to tackle NORA questions, as Google calls them – queries with no one right answer.

To do this, Bard first uses LaMDA language models to understand your question and its context, even if it contains slang terms that search engines have traditionally struggled with. After that, Bard draws on information it finds on the internet to craft a response, which is then turned into the kind of conversational reply you might expect from a real person (again, thanks to LaMDA).
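That two-stage retrieve-then-respond pattern can be sketched as a toy Python program. This is emphatically not Google's pipeline: a trivial word-overlap score stands in for LaMDA's language understanding, a three-entry dictionary stands in for the web, and a string template stands in for the conversational model; every name and data value here is invented for illustration.

```python
# Toy sketch of the retrieve-then-respond pattern described above. NOT Google's
# actual pipeline: word overlap stands in for LaMDA's "understanding", a tiny
# dictionary stands in for the web, and a template stands in for the
# conversational model. All names and data are invented for illustration.

TOY_CORPUS = {
    "electric vehicles": "EVs have lower running costs and no tailpipe emissions.",
    "hybrid cars": "Hybrids pair a gasoline engine with an electric motor.",
    "car loans": "Loan rates depend on your credit score and the loan term.",
}

def retrieve(question: str) -> str:
    """Stage 1: 'understand' the question by picking the best-matching topic."""
    q_words = set(question.lower().replace("?", "").split())
    best_topic = max(TOY_CORPUS, key=lambda topic: len(q_words & set(topic.split())))
    return TOY_CORPUS[best_topic]

def respond(question: str) -> str:
    """Stage 2: wrap the retrieved information in a conversational reply."""
    return f"Good question! Here's what I found: {retrieve(question)}"

reply = respond("What are the benefits of electric vehicles?")
print(reply)
```

The real system replaces each stage with a large neural model, but the division of labor is the same: interpret the query, gather relevant information, then phrase it conversationally.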

Google wants you to use this tool to improve your understanding of topics and make decisions. During a demonstration in Paris, the company asked the chatbot to help it decide which car to buy, then asked about the benefits of electric vehicles. Such features can negate the need to click through search results, but Google is careful to maintain its relationship with websites and content creators. Senior Vice President Prabhakar Raghavan said the following:

As we expand these new generative AI capabilities into our search results, we continue to prioritize approaches that allow us to send valuable traffic to a variety of creators and support a healthy, open web.

When can I use Bard?

In addition to internal dogfood users, Google has already made Bard available to a select group of trusted testers. The company has announced that it will open a public early-access program for the tool in the coming weeks. When beta registration becomes available, we’ll make sure this page is updated with a link and instructions on how to participate.

During testing, Bard will use a lightweight model version of LaMDA, which Google says will allow the preview version of the tool to be made available to more users. The company intends to use this testing period to optimize Bard’s accuracy, quality, and speed.

The Bard trial is a standalone tool, but it will eventually integrate with Google Search

Eventually, once Bard completes its testing phases, it will be integrated into Google Search. At that point, using the feature should be as simple as typing any query into the search bar – you’ll just find things look different when Google gives you a full answer in plain English instead of a card and a list of links.

Source: www.androidpolice.com


Elon Musk teases ‘Master Plan 3’ to be unveiled at Tesla Investor Day

Elon Musk is set to unveil his third master plan to the world on March 1.

The CEO of Tesla/SpaceX/The Boring Company/Twitter announced on Twitter that “Master Plan 3” will be revealed at Tesla’s Investor Day on March 1. According to the post, the event will be held at Giga Texas, Tesla’s manufacturing facility in the state. Musk says the plan will lay out “the path to a fully sustainable energy future for Earth,” adding that “the future is bright!”

Musk has been teasing “Master Plan 3” since March of last year. In response to someone expressing excitement about the plan, Musk said the main Tesla subjects would be “scaling to extreme size, which is needed to shift humanity away from fossil fuels, and AI,” adding that he would also include sections about SpaceX, Tesla, and The Boring Company.

If you’re wondering what the previous master plans were, here’s a quick breakdown. As Musk summed it up, Master Plan 1, announced in 2006, consisted of:

  • Create a low-volume car, which would necessarily be expensive
  • Use that money to develop a medium-volume car at a lower price
  • Use that money to create an affordable, high-volume car
    And…
  • Provide solar power. No kidding, this has literally been on our website for 10 years.

Master Plan 2, which Musk dubbed “Part Deux” and unveiled in 2016, consisted of:

  • Create stunning solar roofs with seamlessly integrated battery storage
  • Expand the electric vehicle product line to address all major segments
  • Develop a self-driving capability that is 10 times safer than manual driving via massive fleet learning
  • Enable your car to make money for you when you aren’t using it

Tesla is likely to stream Investor Day on March 1, though the company has yet to add a placeholder video on YouTube. Keep an eye out, as it would be surprising if a stream didn’t materialize.

Musk’s announcement comes weeks after the CEO revealed the Cybertruck won’t go into mass production until 2024.

Source: bgr.com



Android 14 timeline: when will it launch?


TL;DR

  • Google has published its planned release schedule for Android 14.
  • The planned timeline is similar to what Google had for Android 13.
  • The stable launch is expected to take place in August.

It’s been six months since Google released Android 13, and now the company is gearing up for the next iteration of the operating system – Android 14. Google has unveiled its Android 14 timeline, and it looks a lot like the one the tech giant had for Android 13.

Today, Google released the first developer preview build of Android 14. Judging from the company’s Android 14 release chart, there will be at least one more developer preview. After these previews, a few betas will follow before the final release. Per the chart, Google expects to reach platform stability in June.

Since Android 14’s release schedule is so similar to Android 13’s, it’s safe to assume Google will take a similar approach. Last year, Google shipped two developer previews and eight betas of Android 13 before releasing the official update. If Google does something similar this year, chances are we’ll see Android 14 launch in August. However, there’s no guarantee there won’t be delays.

As for which phones will run Android 14, you can expect every Pixel that hasn’t yet hit its major OS update limit – which means every Pixel from the Pixel 4a 5G onward – to receive the update. The same goes for Samsung’s latest phones. If you’re unsure whether your Samsung phone is still eligible, check out our list of all Samsung devices slated to receive four major operating system updates.

As always, the Android 14 release date for Pixels will likely be ahead of everyone else’s. Samsung’s timeline is trickier to pin down, but One UI 5 was Samsung’s fastest OS rollout yet, and the company has said it wants the next update to be even faster.


Source: www.androidauthority.com
