Holding Details

Barcode: 30053003801704
Home Location: Paris-Bourbon
Call No: 006.3 YUDK
Title: If anyone builds it, everyone dies : why superhuman AI would kill us all / Eliezer Yudkowsky & Nate Soares.
Author: Yudkowsky, Eliezer, 1979- author.
Collection: NEW: Adult 000-099
Copies

Home Location: Paris-Bourbon
Barcode: 30053003801704
Call No: 006.3 YUDK
Created On: 10/20/2025
Circ Status: Available
Catalog Details

Personal Name: Yudkowsky, Eliezer, 1979- author.
Title Statement: If anyone builds it, everyone dies : why superhuman AI would kill us all / Eliezer Yudkowsky & Nate Soares.
Edition Statement: First edition.
Publication: New York : Little, Brown and Company, 2025.
Physical Description: xii, 259 pages ; 25 cm
Content Type: text txt rdacontent
Media Type: unmediated n rdamedia
Carrier Type: volume rdacarrier
Bibliography, Etc. Note: Includes bibliographical references and index.
Summary: "In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next. For decades, two signatories of that letter--Eliezer Yudkowsky and Nate Soares--have studied how smarter-than-human intelligences will think, behave, and pursue their objectives. Their research says that sufficiently smart AIs will develop goals of their own that put them in conflict with us--and that if it comes to conflict, an artificial superintelligence would crush us. The contest wouldn't even be close. How could a machine superintelligence wipe out our entire species? Why would it want to? Would it want anything at all? ... Yudkowsky and Soares walk through the theory and the evidence, present one possible extinction scenario, and explain what it would take for humanity to survive"-- Provided by publisher.
Subject: Artificial intelligence.
Subject: Artificial intelligence--Social aspects.
Subject: Human beings--Extinction.