FAKE DATA IMPROVED THE PERFORMANCE OF ROBOTS BY 40%

Summary: A recent study showed that expanding training data sets with "fake data" yielded at least a 40% increase in robot performance. The new method widens training data sets for robots that work with soft objects like ropes and fabrics, or in cluttered environments, taking a step toward robots that can learn on the fly the way people do.

 


WHAT DO WE MEAN BY "FAKE DATA"?

The program was created by robotics researchers at the University of Michigan and could cut the time it takes a robot to learn to handle new materials and settings from a week or two down to a few hours.

In simulations, the larger training data set more than doubled the success rate of a robot looping a rope around an engine block, and it improved the success rate of a physical robot performing the same task by more than 40%.

images/20220721_3-1.jpg

▲ Image source: dataconomy


Manipulating a hose or belt around an engine is one of the jobs a robot mechanic would need to be competent at. However, according to Dmitry Berenson, U-M associate professor of robotics and senior author of a study presented at Robotics: Science and Systems in New York City, learning how to manipulate each unfamiliar hose or belt would require extremely large amounts of data, likely gathered over days or weeks.

During that period, the robot would experiment with the hose, extending it, joining its ends, wrapping it around objects, and so on, until it was aware of all the possible motions the hose could make.

“If the robot needs to play with the hose for a long time before being able to install it, that’s not going to work for many applications,” stated Berenson.

Indeed, a robot coworker that required that much time would probably not be well received by human mechanics. Therefore, Berenson and Peter Mitrano, a robotics PhD student, modified an optimization algorithm to allow computers to make some of the generalizations that people do, such as forecasting how dynamics seen in one instance would replicate in others.

In one instance, the robot maneuvered a cylinder across a crowded tabletop. Sometimes the cylinder didn't make contact with anything, but other times it did, and the other cylinders moved as a result.

images/20220721_3-2.jpg

▲ Image source: dataconomy


If the cylinder doesn't make contact with anything, the same motion can be repeated anywhere on the table where the trajectory doesn't run it into other cylinders. A human would grasp this immediately, but a robot would have to discover it. Rather than undertaking time-consuming experiments, Mitrano and Berenson's program can generate variations on the outcome of the initial experiment that help the robot in the same way.
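To make that concrete, here is a minimal Python sketch of the kind of augmentation described for the cylinder task. It is only an illustration under my own assumptions, not the authors' code: the names (augment_free_trajectory, trajectory_is_valid) are hypothetical, and trajectories are simplified to 2D paths of the pushed cylinder's center. One recorded, contact-free trajectory is translated to random spots on the table, and only placements that stay on the table and clear of the other cylinders are kept as new training examples.

```python
import numpy as np

def trajectory_is_valid(traj, obstacles, table_bounds, radius):
    """Check that a trajectory stays on the table and never brings the
    pushed cylinder into contact with any of the other cylinders."""
    (xmin, xmax), (ymin, ymax) = table_bounds
    on_table = bool(np.all((traj[:, 0] >= xmin) & (traj[:, 0] <= xmax) &
                           (traj[:, 1] >= ymin) & (traj[:, 1] <= ymax)))
    clear = all(np.min(np.linalg.norm(traj - obs, axis=1)) > 2 * radius
                for obs in obstacles)
    return on_table and clear

def augment_free_trajectory(traj, obstacles, table_bounds, radius,
                            n_samples=100, seed=0):
    """Turn one recorded, contact-free trajectory into many 'fake' ones by
    replaying it at random positions on the table, keeping only placements
    that stay on the table and away from the other cylinders."""
    rng = np.random.default_rng(seed)
    (xmin, xmax), (ymin, ymax) = table_bounds
    variants = []
    for _ in range(n_samples):
        new_start = rng.uniform([xmin, ymin], [xmax, ymax])
        candidate = traj - traj[0] + new_start  # rigid translation of the path
        if trajectory_is_valid(candidate, obstacles, table_bounds, radius):
            variants.append(candidate)
    return variants

# Example: one straight 20-step push on a 1 m x 1 m table, with two other
# cylinders (radius 5 cm) sitting elsewhere on the table.
recorded = np.linspace([0.1, 0.5], [0.3, 0.5], 20)
others = [np.array([0.5, 0.5]), np.array([0.8, 0.2])]
fake = augment_free_trajectory(recorded, others,
                               table_bounds=((0.0, 1.0), (0.0, 1.0)),
                               radius=0.05)
print(f"{len(fake)} valid variants generated from one real example")
```

In this contact-free case, each kept variant is essentially as informative as a new real experiment, because translating the whole path doesn't change the motion involved; that is exactly the generalization the article says a human would make without being told.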

The researchers concentrated on three characteristics for their fake data: it has to be relevant, diverse, and valid. For example, if you're only interested in the robot moving the cylinders on the table, data about the floor is irrelevant. At the same time, the fake data must be diverse: all regions of the table and all orientations must be covered.
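The diversity criterion can also be sketched in code. The snippet below is a hypothetical illustration, not the study's implementation (select_diverse_variants is my own name): given a pool of candidate fake trajectories that are already relevant and valid, one simple way to enforce diversity is greedy farthest-point selection over their starting positions, so the kept examples spread across the whole table instead of clustering in one spot.

```python
import numpy as np

def select_diverse_variants(candidates, k):
    """Greedy farthest-point selection over trajectory start positions.

    `candidates` is a list of (T, 2) arrays assumed to already be relevant
    (on the table) and valid (collision-free). We keep `k` of them, each
    time adding the candidate whose start point is farthest from all starts
    kept so far, so the fake data covers the whole workspace."""
    starts = np.array([traj[0] for traj in candidates])  # (N, 2)
    kept_idx = [0]                                        # seed with the first candidate
    while len(kept_idx) < min(k, len(candidates)):
        kept_starts = starts[kept_idx]                    # (M, 2)
        # distance from every candidate start to its nearest kept start
        dists = np.linalg.norm(starts[:, None, :] - kept_starts[None, :, :], axis=2)
        nearest = dists.min(axis=1)
        kept_idx.append(int(nearest.argmax()))
    return [candidates[i] for i in kept_idx]

# Example: thin a pool of 200 randomly placed variants down to 20 well-spread ones.
rng = np.random.default_rng(0)
pool = [rng.uniform(0.0, 1.0, size=(1, 2)) + np.linspace([0, 0], [0.2, 0], 20)
        for _ in range(200)]
diverse = select_diverse_variants(pool, k=20)
print(len(diverse), "diverse variants kept")
```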

Reposted from: dataconomy.com

If you like this article, please follow our Facebook page: Big Data In Finance
