ChatGPT founder says bitcoin is ‘super logical’ next step for tech
OpenAI boss Sam Altman has labelled bitcoin the "super logical" next step for technological progress, just months after launching his own cryptocurrency.

Mr Altman, whose company launched the viral AI assistant ChatGPT last year, made the comments during a recent appearance on The Joe Rogan Experience podcast, during which he also lamented the role of government regulation in the crypto space.

"The war on crypto... that makes me quite sad about the country," Mr Altman said. "I think this idea that we have this global currency that is outside of the control of any government is a super logical and important step on the tech tree."

The price of bitcoin fell sharply in 2021 and 2022, a slump compounded by the collapse of the FTX crypto exchange. The US Securities and Exchange Commission has since filed lawsuits against other exchanges like Binance and Coinbase as part of a crackdown on the industry. A crypto bill has also been introduced by Senator Elizabeth Warren in an attempt to address "crypto's use in money laundering, drug trafficking, and financing of terrorism and rogue nations".

The OpenAI boss also spoke briefly about his own cryptocurrency project, called Worldcoin, which has faced several controversies since officially launching in July. The project involves collecting people's biometric data through an iris-scanning orb in exchange for a share of the crypto token WLD. The idea is to use the data to verify each individual's "unique personhood", ensuring that no one is able to claim more than their allotted share of the cryptocurrency.

The unusual approach has been branded both "outlandish" and "revolutionary" by crypto commentators, with some warning that the sensitive nature of the data means it could be exploited by nefarious actors. Regulators in several countries, including France and Germany, are investigating whether Worldcoin's operations violate data security rules.
Worldcoin has acknowledged the privacy concerns, noting in a blog post in August that "everything is optional" and that no personal information needs to be tied to the iris scan. "[The Orb] validates a person's humanness locally on the device, without needing to send, upload or save images," the post stated. "By default, the Orb promptly deletes iris images after the creation of the iris code."
2023-10-10 23:28
Steven Crowder suspended from YouTube for letting Alex Jones guest host
Right-wing YouTuber Steven Crowder is – once again – just a single strike away from
2023-05-20 06:47
Ford electric vehicle owners to get access to Tesla Supercharger network starting next spring
All of Ford's current and future electric vehicles will have access to about 12,000 Tesla Supercharger stations in the U.S. and Canada starting next spring
2023-05-26 07:23
Shell Challenged on Net Zero After Fossil-Fuel Investment Boost
Legal & General Investment Management, the UK’s largest asset manager, said it wants Shell Plc to explain how
2023-06-16 12:28
US FCC proposes to force cable TV operators to disclose full pricing
By David Shepardson WASHINGTON The U.S. Federal Communications Commission (FCC) on Tuesday proposed a rule that would require
2023-06-21 05:24
Fact check: Ron DeSantis on Amanda Gorman poem being pulled from a Florida elementary school library
Florida Gov. Ron DeSantis said Friday that he "had nothing to do with" a poem recently being moved from an elementary school library to a middle school library.
2023-05-31 08:30
Milwaukee bankruptcy avoidance plan up for approval in Wisconsin Legislature
A plan to prevent Milwaukee from going bankrupt is expected to win bipartisan approval in the Wisconsin Legislature
2023-06-14 12:15
China's Singles Day festival wraps up with e-commerce giants reporting sales growth
By Casey Hall SHANGHAI (Reuters) - China's largest e-commerce player Alibaba Group said it recorded year-on-year growth over this year's Singles
2023-11-12 21:45
Konami announces it’s on the hunt for in-house ‘Silent Hill’ development team
Revealing that its ideal candidates have a streak of darkness, Konami has confirmed it is on the hunt for a new in-house ‘Silent Hill’ development squad – one with “maniacal sensibilities”.
2023-11-15 00:47
iPhone 15 Pro: How Apple made the smartphone into a camera like none before it
The iPhone is a lot of things. It's a social networking portal, it's a games console – sometimes it's even a phone. For Jon McCormack, Apple's vice president for camera software engineering, it's "primarily a camera that you can text from".

It wasn't always this way. When Steve Jobs introduced the iPhone in 2007, he famously described it as an iPod, a phone and an internet communications device; the first iPhone had a camera, but new iPhones are cameras. The pictures that first iPhone turned out were more useful than beautiful.

Today, however, the iPhone's pictures have grown up, and it is now the most popular camera in the world. The camera is no longer just a useful addition: it is used in professional contexts and is often given as the main reason to upgrade to new models. If anything, the criticism now is that the pictures it turns out are too sharp.

The new iPhone 15s, in particular the premium Pro and Pro Max, continue Apple's mission to turn its smartphones into cameras like nothing in the history of photography. They have new image formats and extra focal lengths, and the iPhone 15 Pro Max even includes a 5x zoom that makes use of a "tetraprism" design that bounces light around inside the phone to add dramatically more reach without making the phone any bigger.

All of that additional hardware works in collaboration with improved software: users no longer have to click into portrait mode, for instance, because the camera automatically captures depth information when taking a picture of people, so that background blur can be added and edited even after the photo has been taken.

Apple has also added a host of features that many people are unlikely ever to even look at, let alone use, but that are important to professionals. They include Log encoding and the Academy Color Encoding System – both key to those who need them.
Apple also says that the new iPhone has "the equivalent of seven pro lenses", despite really only having three. What it means is that users can choose different crops, in part an attempt to appeal to those professional photographers who stubbornly say that they will only ever work with a 50mm lens, for instance. (Those new lens choices are not simply cropped versions of the existing lenses, says McCormack, since the phone also has custom neural networks specifically designed to optimise images at each focal length.)

Those complex new features are a reminder that the iPhone is many things to many users: some may simply want to remember important events, or snap pictures of their pets. Others might be truly professional photographers, needing to rely on their iPhone to capture valuable and fleeting events. Some people are, no doubt, both – and Apple is aware that the iPhone has to be both, too.

"For us, what we feel is really important – especially since computational photography started to blur the line between hardware and software, and really enable anybody to take stunning shots with minimal effort – is making sure that that tool that we have in your pocket is adapting to your needs," says Maxime Veron, Apple's senior director for iPhone product marketing. "So if you're just trying to take a quick photo of your kids, [it] can get out of the way and just allow you to do that. And if you want to create a professionally created Hollywood style video, it can also give you the customisation and the power to do that."

McCormack says that Apple builds the camera from "the core belief that everybody has got a story that is worth telling". For some people that story might be their child taking their first steps, captured in a video that will be shared with only a few people. Or it might be a photojournalist taking images that are going to be shared with millions.
"Our belief is that your level of technical understanding shouldn't get in the way of you being able to tell that story," he says.

High-end cameras have often required their users to think about a whole host of questions before they even get to actually pressing the button to take a picture: "the temperature of light, the amount of light, the direction of light, how fast is the subject moving? What are the skin tones?" notes McCormack.

"Every second that you spend thinking about that, and playing with your settings and things like that, are seconds that you are drawn out of the moment," he says. "And what we want to create is this very deep connection between the photographer, the videographer and the moment." He points to the action button on this year's Pro models, which can be programmed to launch the camera with a push.

"It's all about being able to say all of this crazy complexity of photography, or videography – Apple's taken that, and understood that, and hidden that from you," he says. "You as a photographer, you get to concentrate on the thing that you want to say, and finding that decisive moment, finding that beautiful framing, that says the thing that you want to say.

"But the motivation for all of this and using all of this crazy, great computational photography, computational videography, is that we don't want to distract you from telling the story that you want to tell."

That has meant building the iPhone's camera in a way that the features "unfold", he says. "Out of the box, we are going to give you an amazing thing that is going to cover most of your moments, with lots of dynamic range, lots of resolution, zero shutter lag, so you can capture the moment.

"But of course, there are folks who are going to look at this and say, you know, I've got a very specific and very prescriptive vision," he says.
He points to a variety of new tools that are built into the phone, such as the ProRAW format, which makes huge files and is not especially useful to most – but can be key to someone who really wants to be able to process every detail of a photograph after it is taken. Those are hidden within settings, there for the people who need them but not troubling those who don't. Veron also notes that many of those extra features are enabled by "an amazing ecosystem of third party partners" who make apps that allow people to get the features they are looking for.

It is a reminder of just how much is going on as soon as someone takes a picture with the iPhone. First, light travels through one of Apple's three lenses and hits a 48-megapixel sensor – but that's just the beginning of a long process of computational photography that analyses and optimises the image. The picture that is taken is not just one image, for example: it is actually made up of multiple exposures, with more or less light, that can then be merged into a picture with the full dynamic range.

"This year for the first time, we merge them in a larger resolution," says McCormack. The phone takes one image at 12 megapixels, combining pixels together to give a fast shutter speed and plenty of light; then it grabs a 24-megapixel frame, which collects the detail. "Then we register those together and use a custom machine learning model to go and transfer the detail from the 48 over into what has now become a 24."

That creates something like the negative in old camera terms, which the iPhone's processor can then get to work on, using parts of its chip focused on machine learning. "We use the neural engine to go decompose that photograph, bit by bit." It will notice if people have different skin tones, and develop those parts of the image accordingly; hair, eyes, a moving background and more are all taken to pieces and optimised on their own.
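The merge McCormack describes, a clean low-resolution frame combined with the fine detail of a larger one, has a simple textbook shape. The sketch below is an illustration of that general idea only, not Apple's pipeline: the function names, the nearest-neighbour upsampling and the crude box blur are all invented for the example.

```python
import numpy as np

def box_blur(img, k=3):
    """Crude k-by-k box blur using edge padding (illustration only)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def merge_frames(base_low, detail_high):
    """Upsample the low-res, low-noise frame 2x (nearest neighbour),
    then add back the high-frequency component of the high-res frame."""
    upsampled = np.kron(base_low, np.ones((2, 2)))
    high_freq = detail_high - box_blur(detail_high)
    return upsampled + high_freq
```

In the real phone the "base" and "detail" frames are registered against each other and the transfer is done by a learned model rather than a fixed filter, but the division of labour is the same: one frame contributes light, the other contributes detail.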
(The intensity of that process has occasionally led to questions over whether the phone is working too hard to make its images look good.)

Then there's yet more work for the camera system. The iPhone uses tonemapping to ensure that images pop on the bright screens of modern iPhones, but also that they still look bright in a compressed image that might be sent around the internet; one of the many changes that smartphones have brought to photography is that, for the first time, photos are mostly looked at on the same device they were taken with, but can also be sent and seen just about anywhere.

If the image is taken using night mode, there's even more work, with new tools that ensure colours are more accurate. And that isn't even mentioning portrait mode, which, when it registers that there is a person (or a pet) in the frame, will gather the relevant depth information to ensure that the background can be manipulated later.

That whole process – those five paragraphs, and thousands of calculations by the phone – happens within the tiniest of moments after pressing the button to take the photo. The phone may look as if it is serenely offering up an image to its users, but it has been busily working away in the background to ensure the picture is as accurate and vibrant as possible.

All that work done by the camera and the rest of the device depends on a variety of choices made not only by the iPhone but by Apple, which accounts for the look of the modern iPhone picture. Veron says that its aim in making those decisions is to make "beautiful, true-to-life memories in just one click".

McCormack is clearly keenly aware of the responsibility of that task; his vision decides what the world's memories look like. "This is your device that you carry with you all the time, and we want to be really, really thoughtful of that," he says.
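Tonemapping of the kind mentioned above has well-known published analogues. A minimal sketch of one, the extended Reinhard global operator, is below; the single luminance channel and the chosen `white` point are assumptions for the example and say nothing about Apple's actual curve.

```python
import numpy as np

def tonemap(luminance, white=4.0):
    """Extended Reinhard global operator: compresses HDR luminance into
    the displayable range, with values at `white` mapping to exactly 1.0.
    A classic textbook curve, standing in for whatever proprietary
    tonemapping the iPhone actually applies."""
    l = np.asarray(luminance, dtype=float)
    return l * (1.0 + l / white**2) / (1.0 + l)
```

The curve is roughly linear for dim pixels and rolls off smoothly for bright ones, which is why a tonemapped image can hold both shadow detail and bright highlights on an ordinary screen.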
That responsibility carries into the design of the camera within the phone: rumours had suggested that this year's model would include a "periscope" design for the long zoom, bouncing the light through the length of the iPhone, but McCormack says that Apple went for the tetraprism to ensure that it could "both retain the industrial design that we want, to just make iPhone feel so good in your hand, but also be able to get that extra focal length".

"It is just one of those crazy things – only Apple is going to do something like that. And I'm really glad that that's the way we think about product."
2023-09-18 22:27
US lawmakers seek new law to protect TikTok user information
By David Shepardson WASHINGTON A bipartisan group of six senators and two members of the House of Representatives
2023-06-15 02:26
Inside Titanic director James Cameron's obsession with the deep ocean
Public interest in the deep ocean went into a frenzy this week as the search for the doomed Titan submersible played out – and Oscar-winning director James Cameron has made no secret of the fact that he is obsessed with the subject.

Since it emerged on 22 June that the Titan was destroyed in what US authorities called a “catastrophic implosion”, Cameron has been telling media outlets that he had known the five-man crew’s fate since Monday, four days earlier. After calling up his “contacts in the deep submersible community”, Cameron said, he had already ascertained that the vessel had been destroyed in an implosion. “I felt in my bones what had happened.”

But why does Cameron know so much about the ocean depths?

Titanic, Avatar and The Abyss

First of all, Cameron has made a lot of films about the bottom of the sea. His 1997 film, Titanic, won 11 Oscars and was the first movie to earn more than $1bn worldwide, and Cameron went deep on his research – literally.

The filmmaker has visited the real-life wreck of the Titanic 33 times, making his first trip in 1995 to shoot footage for the film. One of those dives even involved getting trapped at the wreck for 16 hours, with currents of water holding the director’s submersible at the bottom of the ocean. He has even written a book about his experiences, Exploring The Deep, which includes details of his dive journeys, photos and maps from his own explorations of the wreck. He told ABC News: “I actually calculated [that] I've spent more time on the ship than the captain did back in the day.”

Long before Titanic, Cameron directed The Abyss in 1989. The premise of the film is that an American submarine sinks in the Caribbean – sound familiar? That prompts a search and recovery team to race against Soviet vessels to recover the boat. Meanwhile, the latest movie in Cameron’s famous Avatar franchise, The Way of Water, is set in the aquatic ecosystems of a world 25 trillion miles from Earth.
"Some people think of me as a Hollywood guy … (but) I make 'Avatar' to make money to do explorations," Cameron told The Telegraph.

Going even deeper

In 2012, Cameron went a step further, plunging nearly 11km down to the deepest place in the ocean, the Mariana Trench in the western Pacific. The filmmaker made the solo descent in a submersible called the Deepsea Challenger, and it took more than two hours to reach the bottom. The vessel was years in the making, designed by Cameron himself with a team of engineers.

The trip was only the second manned expedition to the Mariana Trench. The first was in 1960, when US Navy Lieutenant Don Walsh and Swiss scientist Jacques Piccard descended to the ocean floor. “It was absolutely the most remote, isolated place on the planet,” Cameron said in a later interview. “I really feel like in one day I've been to another planet and come back.”

He was even underwater when 9/11 happened

His obsession with the ocean goes back to age 17, he told the New York Times, when he learned to scuba dive and felt he had discovered the “keys to another world”. And between making Titanic in 1997 and Avatar in 2009, Cameron didn’t make a feature film. But he did make documentaries about sea exploration. One of those, 2003’s Ghosts of the Abyss, showed Cameron's travels to the Titanic, while the other, 2005’s Aliens of the Deep, saw Cameron team up with NASA scientists to explore the sea creatures of mid-ocean ridges.

Cameron’s fascination even meant he was inside a submersible exploring the Titanic on 11 September 2001, when terrorists flew two passenger jets into the World Trade Centre. It was only after the now-68-year-old director and his crew finished their expedition and returned to the main ship that Cameron learned what had happened. “What is this thing that’s going on?” Cameron asked the late actor Bill Paxton, who played treasure hunter Brock Lovett in the film.
“The worst terrorist attack in history, Jim,” Paxton said. Cameron realised he “was presumably the last man in the Western Hemisphere to learn about what had happened,” he told Spiegel in 2012.
2023-06-23 20:29