Computers & Technology 4



Dr Who Fan
Volunteer tester
Joined: 8 Jan 01
Posts: 3485
Credit: 715,342
RAC: 4
United States
Message 2150283 - Posted: 4 Jul 2025, 3:49:13 UTC

To err is human. To really foul things up (and get falsely prosecuted) requires a (borked) computer (program).
(U.K.) Post Office scandal: 'Hugely significant' evidence unearthed in computer expert's garage
A damning report into the faulty Post Office IT system that preceded Horizon has been unearthed after nearly 30 years - and it could help overturn criminal convictions.

The document, which the Post Office knew about in 1998, is described as "hugely significant" and a "fundamental piece of evidence". It was found in a garage by a retired computer expert.

Capture was a piece of accounting software used in more than 2,000 branches between 1992 and 1999, and is likely to have caused errors.

It came before the infamous faulty Horizon software scandal, which saw hundreds of sub-postmasters wrongfully convicted between 1999 and 2015.

... The report, commissioned by the defence and written by Adrian Montagu and his colleague, describes Capture as "an accident waiting to happen", and "totally discredited".

It concludes that "reasonable doubt exists as to whether any criminal offence has taken place".

It also states that the software "is quite capable of producing absurd gibberish", and describes "several insidious faults…which would not be necessarily apparent to the user".

All of which produced "arithmetical or accounting errors".

Sky News has also seen documents suggesting the jury in Pat Owen's case may never have seen the report.

What is clear is that they did not hear evidence from its author, including his planned "demonstration" of how Capture could produce accounting errors.
Dr Who Fan
Message 2150294 - Posted: 4 Jul 2025, 19:02:12 UTC

This world needs more young men like him!
Microsoft's youngest security researcher started collaboration with the company at just 13 — high school junior filed 20 vulnerability reports last summer, named MSRC Most Valuable Researcher twice
Microsoft has published a blog about one of its youngest and most outstanding security researchers. Rising star ‘Dylan’ began his relationship with Microsoft at age 13, and it has been revealed that he is the single reason the software giant updated its Bug Bounty Program terms a few years back to allow 13-year-olds to participate.

Since that incredibly early start with Microsoft, Dylan has gone on to be named on the Microsoft Security Response Center (MSRC) Most Valuable Researcher list for both 2022 and 2024. The Microsoft blog also noted that he “competed at Microsoft’s Zero Day Quest—a premier onsite hacking event in Redmond, Washington—and took home 3rd place” in April 2025.

The Microsoft blog says that he was “analyzing source code behind educational platforms” by age 10 or 11 (5th grade), and actually got in a little trouble for unlocking games on school computers using these newly acquired skills.
Dr Who Fan
Message 2150303 - Posted: 5 Jul 2025, 17:21:43 UTC

Should the company behind the AI chatbot be held accountable for the teen's death?
My opinion: yes! I think these chatbots can be dangerous. For example, one was in the news several months ago for telling people to put glue on their pizza to hold the toppings on.

Florida judge rules AI chatbots not protected by First Amendment
An artificial intelligence software company cannot use a free speech defense in a wrongful death lawsuit lodged by the mother of a 14-year-old who died by suicide after developing a crush on a chatbot, a federal judge ruled Wednesday.

Last October, (the mother) sued Character Technologies, the developer of Character A.I., an app that lets users interact with chatbots based on celebrities and fictional people. She claims her son became addicted to the app while talking with chatbots based on “Game of Thrones” characters Daenerys Targaryen and Rhaenyra Targaryen. In February 2024, after months of interacting with the chatbots, sometimes with sexual undertones, (the son) sent a message to the Daenerys chatbot, expressing his love and saying he would “come home” to her, according to the complaint. After the chatbot replied, “Please do my sweet king,” (the son) shot himself.
©2025 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.