Friday, May 27, 2022

Image: A team of coders and marketers shows that computers can write like humans, and explains why it matters.

Credit: wine photo by Pier Demarten on Unsplash. Illustration by Richard Clark/Dartmouth College.

According to a study by Dartmouth College, Dartmouth's Tuck School of Business, and Indiana University, artificial intelligence systems can be trained to write human-like product reviews that help consumers, marketing specialists, and professional critics.

The research, published in the International Journal of Marketing Research, also identifies the ethical challenges raised by the use of computer-generated content.

“Review writing is a challenge for humans and computers, in part because of the overwhelming number of distinct products,” said Keith Carlson, PhD researcher at the Tuck School of Business. “We wanted to see how artificial intelligence can be used to help the people who produce and use these reviews.”

For the research, the Dartmouth team took on two challenges. The first was to determine whether a machine could learn to write original, human-quality reviews using only a small number of product features after being trained on an existing body of content. Second, the team set out to see if machine learning algorithms could be used to write review summaries for products for which many reviews already exist.

“Using artificial intelligence to write and synthesize reviews can create efficiencies on both sides of the market,” said Prasad Vana, assistant professor of business administration at the Tuck School of Business. “The hope is that AI can benefit reviewers facing larger writing workloads and consumers who have to sort through so much product content.”

The researchers focused on wine and beer reviews because of the wide availability of material to train the computer algorithms. The descriptions of these products also feature relatively targeted vocabularies, an advantage when working with AI systems.

To determine if a machine could write useful reviews from scratch, the researchers trained an algorithm on about 180,000 existing wine reviews. Metadata tags for factors such as product origin, varietal, rating and price were also used to train the machine learning system.
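The study's code is not reproduced here, but a rough sketch of this kind of metadata-conditioned generation, written with the open-source Hugging Face transformers library, might look like the following. The model choice (GPT-2), the prompt format, the column names, and the file wine_reviews.csv are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of metadata-conditioned review generation.
# Column names, prompt format, and hyperparameters are assumptions for illustration.
import pandas as pd
import torch
from torch.utils.data import Dataset
from transformers import GPT2TokenizerFast, GPT2LMHeadModel, Trainer, TrainingArguments

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

class WineReviewDataset(Dataset):
    """Pairs each review's metadata tags with its text as one training sequence."""
    def __init__(self, frame: pd.DataFrame):
        self.examples = [
            f"origin: {r.origin} | varietal: {r.varietal} | rating: {r.rating} "
            f"| price: {r.price} => {r.review_text}{tokenizer.eos_token}"
            for r in frame.itertuples()
        ]
    def __len__(self):
        return len(self.examples)
    def __getitem__(self, idx):
        enc = tokenizer(self.examples[idx], truncation=True, max_length=256,
                        padding="max_length", return_tensors="pt")
        input_ids = enc["input_ids"].squeeze(0)
        mask = enc["attention_mask"].squeeze(0)
        labels = input_ids.clone()
        labels[mask == 0] = -100  # ignore padding positions in the loss
        return {"input_ids": input_ids, "attention_mask": mask, "labels": labels}

train_ds = WineReviewDataset(pd.read_csv("wine_reviews.csv"))  # hypothetical file
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="wine-review-model", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=train_ds,
)
trainer.train()

# Generate a review for an unseen combination of product attributes.
prompt = "origin: Napa Valley | varietal: Cabernet Sauvignon | rating: 92 | price: 45 =>"
ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
out = model.generate(ids, max_new_tokens=80, do_sample=True, top_p=0.9,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```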

By comparing machine-generated reviews to human reviews for the same wines, the research team found agreement between the two versions. The results remained consistent even when the team challenged the algorithms by changing the amount of input data available for reference.

The machine-written material was then evaluated by non-expert study participants to test whether they could tell if the reviews were written by humans or by a machine. According to the research paper, participants were unable to distinguish between human-written and AI-generated reviews with any statistical significance. Moreover, their stated intention to purchase a wine was similar whether the review was written by a human or generated by a machine.

Having discovered that artificial intelligence can write believable wine reviews, the research team turned to beer reviews to determine the effectiveness of using AI to write “review summaries.” Rather than being trained to write new reviews, the algorithm was instructed to aggregate material from existing reviews of the same product. This tested the AI’s ability to identify and deliver limited but relevant product information drawn from a large number of differing opinions.

“Writing an original review tests the expressive capacity of the computer on the basis of a relatively small set of data. Writing a review summary is a related but separate task in which the system is expected to produce a review that captures some of the key ideas present in an existing set of reviews for a product,” said Carlson, who conducted the research while he was a PhD student in computer science at Dartmouth.

To test the algorithm’s ability to write review summaries, the researchers trained it on 143,000 existing reviews of more than 14,000 beers. As with the wine dataset, the text of each review was paired with metadata including product name, alcohol content, style, and ratings from the original reviewers.
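The study's own summarization model is not shown here; as a loose illustration of the many-reviews-in, one-summary-out pattern it describes, the sketch below runs an off-the-shelf pretrained summarizer over a few made-up beer reviews. The model name and the sample reviews are assumptions, not material from the study.

```python
# Hypothetical sketch: condensing several reviews of one beer into a single summary
# with a pretrained off-the-shelf summarizer, not the model trained in the paper.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

beer_reviews = [  # assumed example inputs, not data from the study
    "Pours a hazy gold with a thick white head. Big citrus and pine aroma.",
    "Bitter but balanced; grapefruit up front and a dry, resinous finish.",
    "A touch sweet for the style, though the hop character is excellent.",
]

# Concatenate the individual reviews and let the model compress them into one summary.
joined = " ".join(beer_reviews)
summary = summarizer(joined, max_length=60, min_length=20, do_sample=False)
print(summary[0]["summary_text"])
```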

As with the wine reviews, the research used independent study participants to judge whether the machine-written summaries captured and condensed the opinions of many reviewers in a useful and human-like way.

According to the article, the model succeeded in taking reviews of a product as input and generating a review summary for that product as output.

“Our modeling framework could be useful in any situation where detailed product attributes are available and a written product summary is required,” Vana said. “It’s interesting to imagine how this could benefit restaurants that can’t afford sommeliers or independent sellers on online platforms that can sell hundreds of products.”

Both challenges used a deep learning neural network based on a transformer architecture to ingest, process, and output review language.
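For readers curious what a transformer architecture refers to, the sketch below shows the standard building block in PyTorch: a self-attention layer followed by a position-wise feed-forward layer. The dimensions and layer choices are generic illustrations, not the configuration used in the paper.

```python
# Minimal, generic transformer block: self-attention plus a feed-forward layer.
# Sizes are illustrative defaults, not the paper's configuration.
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    def __init__(self, d_model: int = 512, n_heads: int = 8, d_ff: int = 2048):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                                nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each token attends to every other token in the review text...
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        # ...then each position is transformed independently.
        return self.norm2(x + self.ff(x))

block = TransformerBlock()
tokens = torch.randn(1, 16, 512)   # (batch, sequence length, embedding size)
print(block(tokens).shape)         # torch.Size([1, 16, 512])
```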

According to the research team, the computer systems are not meant to replace professional writers and marketers, but rather to help them in their work. A machine-written review, for example, could serve as a time-saving first draft that a human reviewer could then revise.

The research can also help consumers. Review summaries, like those of the beers in the study, could be extended to the constellation of products and services in online marketplaces to help people who have little time to read large numbers of product reviews.

In addition to the benefits of machine-written reviews, the research team highlights some of the ethical challenges presented by using computer algorithms to influence human consumer behavior.

Noting that marketers could gain greater acceptance of machine-generated reviews by misattributing them to humans, the team advocates for transparency when computer-generated reviews are offered.

“As with other technologies, we have to be careful about how this breakthrough is used,” Carlson said. “Used responsibly, AI-generated reviews can be both a productivity tool and a way to support the availability of useful consumer insights.”

Researchers who contributed to the study include Praveen Kopalle of Dartmouth's Tuck School of Business, Allen Riddell of Indiana University, and Daniel Rockmore of Dartmouth College.

###

About Dartmouth

Founded in 1769, Dartmouth is a member of the Ivy League and consistently ranks among the top academic institutions in the world. Dartmouth has forged a singular identity by combining its deep commitment to exceptional undergraduate liberal arts education with distinguished research and scholarship in the arts and sciences and its four graduate schools – the Geisel School of Medicine, the Guarini School of Graduate and Advanced Studies, the Thayer School of Engineering, and the Tuck School of Business.
