April 24, 2024

Experts say judges are likely to address concerns about AI on their own and create their own rules for the technology in court.

Last week, U.S. District Judge Brantley Starr of the Northern District of Texas began requiring attorneys appearing in his court to certify that they did not use artificial intelligence programs such as ChatGPT to draft documents without a human checking them for accuracy, a move that could make him a pioneer.

“We at least remind lawyers, who otherwise might not realize it, that they can’t just trust these databases,” Starr, a Trump appointee, told Reuters. “They have to actually verify it themselves with traditional databases.”

Experts interviewed by Fox News Digital called the judge’s move to require AI pledges from lawyers “very good,” and said the approach is likely to be repeated as the race to build more powerful AI platforms continues.


AI guidance in courtrooms may be left to individual judges, experts told Fox News Digital. (iStock)

“I think this is an excellent way to ensure that AI is being used properly,” said Christopher Alexander, chief communications officer at Liberty Blockchain. “Judges are just using the old adage of ‘trust but verify.’”

“The risk that the reasoning might be wrong or biased is too great,” Alexander added. “Legal research is much more complicated than typing numbers into a calculator.”

Starr said he developed the requirement to show lawyers that AI can hallucinate and fabricate cases, and posted a notice on the court’s website warning that chatbots, unlike lawyers, do not swear an oath to uphold the law.


“These platforms, in their current state, are prone to hallucinations and bias. On hallucinations, they make things up, even quotes and citations,” the notice said.

“Unbound by any sense of responsibility, honor, or justice, such programs act according to computer code rather than belief, based on programming rather than principle,” the notice continued.


Experts say judges are likely to address concerns about AI on their own and create their own rules for the technology in court. (Josep Lago/AFP via Getty Images)

Phil Siegel, founder of CAPTRS (Center for Advanced Preparedness and Threat Response Simulation), a nonprofit focused on using simulation gaming and artificial intelligence to improve society’s disaster preparedness, said the judge was being prudent with his AI pledge requirement, adding that AI could take on a future role in the justice system.

“At this point, that’s a sensible position for judges to take. Large language models can hallucinate, just as humans can hallucinate,” Siegel said.

“However, it won’t be long before more focused datasets and models emerge to address this problem in specific fields, like law, but also architecture, finance, etc.,” he continued.

He noted that in the legal field, a dataset could be created that collects all case law and civil and criminal statutes by jurisdiction and is used to train artificial intelligence models.


“These databases could be built using citation markers that follow a specific agreed-upon scheme, which would make it harder for a human or an AI to hallucinate or miscite,” Siegel said. “A citation may also be genuine but come from an unrelated jurisdiction, so it cannot be used in court. Once that dataset and a trained AI are available, this kind of pledge will become moot.”


A Texas judge may have been a pioneer when he asked lawyers in court to certify that they had not used artificial intelligence programs like ChatGPT to draft their documents without human checks for accuracy. (Getty Images)

Aiden Buzzetti, president of the Bull Moose Project, a conservative nonprofit dedicated to “identifying, training and developing the next generation of America First leaders,” said Starr’s requirement is unsurprising given the lack of legislation and guardrails around AI.

“In the absence of proactive legislation to ensure the quality of AI products, it is entirely understandable that individuals and institutions will create their own rules regarding the use of AI materials,” Buzzetti said. “The longer legislation takes, the longer these professions are at risk.”


Starr’s requirement comes after a New York judge threatened to sanction a lawyer who used ChatGPT for a court brief that cited bogus cases.

However, the Texas judge said the incident did not affect his decision. Instead, he began working on his AI rules during a technical panel discussion at a conference hosted by the U.S. Court of Appeals for the Fifth Circuit.



Leaders in other fields have also taken concerns about AI and the lack of regulation around the powerful technology into their own hands, including teachers in the U.K. Eight educators wrote a letter to The Times of London last month warning that, while AI can serve as a useful tool for students and teachers, the risks from the technology make it the “biggest threat” to schools.

The educators are forming their own advisory board to discuss which components of AI schools should ignore in their work.


“As leaders of state and independent schools, we believe artificial intelligence is the greatest threat but also potentially the greatest benefit to our pupils, staff and schools,” the educators wrote in the letter to The Times. “Schools are confused by the rapid changes in AI and seek reliable guidance on the best way forward, but whose advice can we trust?”
