Category: Science&Technology

  • Certain cells secrete a substance in the brain that protects neurons

    Certain cells secrete a substance in the brain that protects neurons, USC study finds
    This image depicts capillaries in a mouse brain. Pericytes are labeled with a fluorescent red protein. Credit: A. M. Nikolakopoulou – Zlokovic Lab

    USC researchers have discovered a secret sauce in the brain’s vascular system that preserves the neurons needed to keep dementia and other diseases at bay.

    The finding, made in a mouse model, focuses on specific cells called pericytes and reveals that they play a previously unknown role in brain health. Pericytes secrete a substance that keeps neurons alive, even in the presence of leaky blood vessels that foul brain matter and lead to neurodegeneration.

    The study, which appears today in Nature Neuroscience, helps explain the cascade of problems that lead to neurodegeneration after stroke, as well as in diseases like Alzheimer’s and Parkinson’s—and suggests a potential strategy for therapy.

    “What this paper shows is if you lose these vascular cells, you start losing neurons. The link with neurodegeneration was really not that clear before,” said senior author Berislav Zlokovic, director of the Zilkha Neurogenetic Institute at the Keck School of Medicine of USC.

    The discovery comes at a time when scientists are beginning to understand Alzheimer’s disease as the result of multiple processes that begin long before memory loss sets in. Many researchers are shifting their focus from the amyloid plaques that accumulate in the brain later in life toward other targets earlier in the timeline.

    Zlokovic, for example, studies the layers of cells that make up blood vessels in the brain. His previous research shows that the more permeable, or leaky, a person’s brain capillaries, the more cognitive disability they have.

    For this new experiment in mice, Zlokovic zeroed in on pericytes in the brain’s blood vessels. Pericytes help regulate blood flow and keep blood vessel walls sealed tight. When researchers artificially removed pericytes, they saw rapid degeneration of the blood-brain barrier, a slowdown of blood flow and the loss of brain cells.

    To further understand the role of pericytes, the scientists infused mice with a protein, or growth factor, secreted by pericytes in the brain and not found elsewhere in the body. They found that, even with pericyte cells artificially removed, the growth factor protected neurons and the brain cells didn’t die. The results persisted even with constricted blood flow.

    Because these pericytes are implicated in many diseases—including Huntington’s, Parkinson’s, stroke and brain trauma—the research offers intriguing possibilities for further investigation.

    “This opens up an entirely new view of the possible pathogenesis of Alzheimer’s disease,” Zlokovic said.


  • Survival of natural world is in balance, says wildlife chief

    The survival of the natural world upon which humanity depends hangs in the balance, according to the new chair of the global scientific body for biodiversity.

    Ana María Hernández said she did not know if society could make the major changes needed to stop the annihilation of wildlife, which some scientists thought was the start of a mass extinction. It would be very difficult to shift society out of its current “comfort zone” of business-as-usual, but she thought the much higher environmental awareness among young people was a reason for great optimism.

    Hernández is chair of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), which in May published the most thorough planetary health check ever undertaken. It concluded that human society was in jeopardy from the accelerating decline of the Earth’s natural life-support systems, with a million species at risk of extinction.

    The destruction of nature by the razing of forests, overhunting and fishing, pollution and the climate emergency has slashed wildlife populations by 60% since 1970 and plant extinctions are running at a “frightening” rate, according to scientists. The web of life that provides the food, clean air and water on which society ultimately depends is being dismantled by unsustainable methods of production and wasteful lifestyles, say conservationists.

    “If I look at my generation, the people who are running the companies and countries and society in general, I don’t know if we are going to be able at this point in time to make extreme transformative changes, because we love to do things the way we always do,” Hernández said. “We are in our comfort zone and it is very difficult to change. If we do not, then I am afraid the next 50 years are going to be very dramatic for us.”

    “But if I look at young people, I am optimistic. They are people who have grown up with environmental concern,” she said. “I think we are going to have this transformation from the old society to a new environmental society over the next 50 years. But if we cannot change business-as-usual we are going to be in trouble.”

    Hernández will oversee a series of new projects by IPBES in the coming years, including a detailed assessment of the connections between biodiversity and food, water and human health. “If we don’t understand the relationship between biodiversity and the very basic needs of our lives then we are not going to understand how deeply biodiversity is important to maintain our own survival,” she said.

    IPBES will also examine the root causes of the destruction of ecosystems, beyond the direct exploitation of wildlife, to include poverty, conflicts and other political and economic factors. There will also be major reports on how global heating is harming biodiversity and the relationship between business and biodiversity.

    Some senior conservationists back moves towards declaring half of the Earth as a protected zone to rescue the natural world. But Hernández said the precise area designated was less important than having effective, on-the-ground enforcement, as some existing places were national parks only on paper.

    “I don’t care about the number, you can have 30%, 50% or 70%,” she said. “But if you don’t have really efficient tools inside those areas to reduce the causes of biodiversity loss, you are not doing anything.”

    Hernández is a Colombian expert in international relations and biodiversity, and said her work had been inspired by the incredible natural environments found in her home nation, such as the rainforests in Guainía. She said people could help preserve wildlife by buying sustainable food and products. “These things will help a lot,” she said. “I am reducing meat consumption at my house.”

    She also said people should use their votes for change: “People often vote not because they know the proposals of the politicians, but because the person is charismatic or famous. Voters must understand the environmental proposals of the candidates.”


  • MIT Researchers Present Cryptographic System That Secures Almost Anything – CoinWire

    The Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT has announced a newly developed system designed to keep online communications secure by running them through complex mathematical algorithms.

    Called Fiat Cryptography, the code is currently being used to secure about 90 percent of communications sent and received by Google’s Chrome browser.

    Google has since used Fiat Cryptography in its open-source cryptographic library, BoringSSL. The library is being used by a multitude of programs including the Chrome browser and numerous Android apps.

    CSAIL researchers presented their paper on Fiat Cryptography at the IEEE Symposium on Security and Privacy in May. CSAIL graduate student Andres Erbsen was credited as first author while fellow grad students Jade Philipoom and Jason Gross were listed as co-authors. They were joined by Adam Chlipala and Robert Sloan.

    Faster than manual encryption

    The system can automatically generate and simultaneously verify optimized cryptographic algorithms for all hardware platforms. Perhaps most notably, this process was still being done by hand as recently as a year ago, before the technology was deployed at MIT labs last year.

    Researchers first wrote candidate implementations in the C programming and assembly languages, then transferred the best-performing algorithms they identified for each architecture into their own code library.

    They then converted the implementations into machine code with a compiler and proved them correct automatically with a mathematical theorem prover called Coq. Each iteration was tested until the best-performing one was identified for each particular chip architecture.

    The researchers found that their automated process was able to match the performance of the best code manually written by humans and was able to complete it much faster.
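
    The generate-verify-benchmark loop described above can be sketched in miniature. This is only an illustrative outline, not the actual Fiat Cryptography pipeline: the candidate functions, test vectors and correctness check below are hypothetical stand-ins (the real system proves correctness formally in Coq rather than spot-checking).

```python
import timeit

# A prime field of the kind targeted by verified crypto compilers.
P = 2**255 - 19  # the field underlying Curve25519

# Hypothetical stand-ins for machine-generated candidate implementations
# of modular multiplication; the real pipeline emits many low-level variants.
def mulmod_naive(a, b):
    return (a * b) % P

def mulmod_divmod(a, b):
    # a deliberately different route to the same answer
    return divmod(a * b, P)[1]

CANDIDATES = [mulmod_naive, mulmod_divmod]

def passes_checks(f, vectors):
    # Crude stand-in for formal verification: compare against a reference.
    return all(f(a, b) == (a * b) % P for a, b in vectors)

vectors = [(3, 5), (P - 1, P - 1), (2**200 + 1, 2**54 + 7)]
verified = [f for f in CANDIDATES if passes_checks(f, vectors)]

# Benchmark the verified candidates and keep the fastest one.
best = min(verified,
           key=lambda f: timeit.timeit(lambda: f(12345, 67890), number=2000))
print(best.__name__)
```

    The winner would differ per chip architecture in the real system; here the benchmark simply picks whichever variant Python runs faster.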

    “It’s basically like taking a process that ran in human brains and understanding it well enough to write code that mimics that process,” said Chlipala, who along with the rest of the research team is working on making their compiler find optimized algorithms even faster.


  • Canadian Astronaut David Saint-Jacques Is Returning from Space

    Only a week separates Canadian astronaut David Saint-Jacques from Earth. He will return home because his mission to the International Space Station is coming to an end. Wednesday marked day 199 for Saint-Jacques, who says he will head back to Earth on Monday night. Even though the return to gravity will be harder on the body than living six months in space, he is excited to get back to his wife and children.

    How Was Saint-Jacques’ Experience in Space?

    The astronaut says he now knows how to fly and do flips, he can stay upside down, and weightlessness doesn’t bother him anymore. The one small issue will be back on Earth: he says he will have to learn how to walk again. He intends to enjoy the last moments of his time in space, and he will return aboard a Russian Soyuz capsule.

    Besides this, Saint-Jacques says he is trying to soak it all in, because once he is back on Earth everything will seem like a dream. His first space mission was alongside NASA astronaut Anne McClain and Russian cosmonaut Oleg Kononenko. The mission began on December 3, when they arrived at the station aboard a Soyuz capsule.

    He is also officially the Canadian who has stayed in space the longest. A highlight for him was the six-and-a-half-hour spacewalk he took part in on April 8, which made him the first Canadian to walk in space since David Williams did it in 2007. His presence aboard was useful in many ways; he used the station’s Canadarm2 robotic arm to capture the SpaceX Dragon cargo craft, another first for a Canadian.

    Finally, the Canadian astronaut says he will miss the view from space, but he misses the little pleasures of our planet much more. He is packing his belongings; he can take only a small box aboard the Soyuz, and the rest of his items will be sent back on the Dragon spacecraft.

    The post Canadian Astronaut David Saint-Jacques Is Returning from Space appeared first on Canadian Homesteading.


  • The mainstream media was WRONG again with its phony climate change predictions – DC Dirty Laundry

    (Natural News) Back in 2008, ABC aired a global warming propaganda series entitled, “Earth 2100” that predicted major calamities to occur all around the world by the year 2015 as a result of climate change. But, wouldn’t you know it? None of the show’s fear-mongering prognostications ever actually came to pass.

    During a recent episode of his radio show, conservative commentator Rush Limbaugh mocked the contents of a trailer for “Earth 2100” that once aired as part of a “Good Morning America” promotion. In it, the on-screen subtitles suggested that past civilizations like the Roman and Mayan empires collapsed, not because of war and conflict, but because of climate change.

    “100 years from now, if New York is abandoned, I picture some explorer coming to Manhattan saying, ‘Those ignorant people! How on earth could they have ever expected to survive?’” one of the film’s “experts” is heard saying during the promo, along with scary music and imagery of the planet dying.

    “We’re running out of oil. We’re running out of water. We’re running out of atmosphere. We’re poisoning the whole planet,” another “expert” declares during the promo, amid graphics of oil wells, deserts, and street pollution.

    Global warming fear-mongers hilariously claimed that a gallon of milk would be $13, and a gallon of gas more than $9, by 2015

    It’s admittedly difficult to get through this entire promo clip without at least a chuckle, seeing as how the claims made in it are about as credible as Al Gore insisting that the normal tide cycles in Miami are a consequence of global warming-induced rising ocean levels.

    But this is what passes as “science” to deranged climate cultists, many of whom continue to lament that inclement weather patterns must be a result of global warming. Even though the evidence is questionable, at best, in defense of such claims, climate apologists insist that the polar ice caps are melting, polar bears are starving, and everyone is going to die in 12 years unless global climate taxes are forcibly implemented.

    “They were predicting that by 2015 – four years ago – milk would be 13 bucks a gallon, and gasoline over $9 a gallon,” Limbaugh joked on his show. “The video effects show Manhattan half underwater. They show very little of Miami left.”

    Perhaps the worst part of the promo is the clips of young people stating that they’re ‘scared as hell right now’ about climate change destroying the planet – extreme symptoms of climate anxiety stemming from the doomsday nonsense constantly being peddled by climate lunatics like those featured in the promo.

    “One can hope that suddenly society will realize that it’s on a doomsday path, and change very rapidly,” one such lunatic is heard stating in the video.

    “If we continue on the business-as-usual trajectory, there will be a tipping point that we cannot avert. We will indeed drive the car over the cliff,” laments another.

    “We have a chance to get it right. We have a chance to move in the right direction now. I don’t think that window of opportunity is going to be open for very long,” states yet another, hilariously using infomercial Act Now! marketing tactics to sell the idea to viewers that we must do something immediately if we’re to even have a chance at surviving global warming.

    More than a decade has since passed, however, and we’re all still here – along with the ice caps and polar bears that these crazies said should already have been long gone. Who’s in denial now?

    For more stories like this, be sure to check out ClimateScienceNews.com.


  • Quantum Computers Could Be True Randomness Generators | WIRED

    Say the words “quantum supremacy” at a gathering of computer scientists, and eyes will likely roll. The phrase refers to the idea that quantum computers will soon cross a threshold where they’ll perform with relative ease tasks that are extremely hard for classical computers. Until recently, these tasks were thought to have little real-world use, hence the eye rolls.

    Quanta Magazine

    Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research develop­ments and trends in mathe­matics and the physical and life sciences.

    But now that Google’s quantum processor is rumored to be close to reaching this goal, imminent quantum supremacy may turn out to have an important application after all: generating pure randomness.

    Randomness is crucial for almost everything we do with our computational and communications infrastructure. In particular, it’s used to encrypt data, protecting everything from mundane conversations to financial transactions to state secrets.

    Genuine, verifiable randomness—think of it as the property possessed by a sequence of numbers that makes it impossible to predict the next number in the sequence—is extremely hard to come by.

    That could change once quantum computers demonstrate their superiority. Those first tasks, initially intended to simply show off the technology’s prowess, could also produce true, certified randomness. “We are really excited about it,” said John Martinis, a physicist at the University of California, Santa Barbara, who heads Google’s quantum computing efforts. “We are hoping that this is the first application of a quantum computer.”

    Randomness and Entropy

    Randomness and quantum theory go together like thunder and lightning. In both cases, the former is an unavoidable consequence of the latter. In the quantum world, systems are often said to be in a combination of states—in a so-called “superposition.” When you measure the system, it will “collapse” into just one of those states. And while quantum theory allows you to calculate probabilities for what you’ll find when you do your measurement, the particular result is always fundamentally random.

    Physicists have been exploiting this connection to create random-number generators. These all rely on measurements of some kind of quantum superposition. And while these systems are more than sufficient for most people’s randomness needs, they can be hard to work with. In addition, it’s extremely difficult to prove to a skeptic that these random-number generators really are random. And finally, some of the most effective methods for generating verifiable randomness require finicky setups with multiple devices separated by great distances.

    The Google AI lab introduced a 72-qubit quantum processor called Bristlecone in 2018.

    One recent proposal for how to pull randomness out of a single device—a quantum computer—exploits a so-called sampling task, which will be among the first tests of quantum supremacy. To understand the task, imagine you are given a box filled with tiles. Each tile has a few 1s and 0s etched onto it—000, 010, 101 and so on.

    If there are just three bits, there are eight possible options. But there can be multiple copies of each labeled tile in the box. There might be 50 tiles labeled 010 and 25 labeled 001. This distribution of tiles determines the likelihood that you’ll randomly pull out a certain tile. In this case, you’re twice as likely to pull out a tile labeled 010 as you are to pull out a tile labeled 001.
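
    The box of tiles is just a weighted distribution, and the classical version of sampling from it is easy to sketch (the 50/25 tile counts below come from the example in the text):

```python
import random
from collections import Counter

# The box from the example: 50 tiles labeled "010", 25 labeled "001".
box = ["010"] * 50 + ["001"] * 25

# Drawing a tile at random is sampling from the distribution the box encodes.
draws = Counter(random.choice(box) for _ in range(60_000))

# Over many draws, "010" should come up roughly twice as often as "001".
print(draws["010"] / draws["001"])
```

    The hard part for a classical computer is not the drawing itself but reproducing the distribution a large quantum circuit specifies.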


    A sampling task involves a computer algorithm that does the equivalent of reaching into a box with a certain distribution of tiles and randomly extracting one of them. The higher the probability specified for any tile in the distribution, the more likely it is that the algorithm will output that tile.

    Of course, an algorithm isn’t going to reach into a literal bag and pull out tiles. Instead, it will randomly output a binary number that’s, say, 50 bits long, after being given a distribution that specifies the desired probability for each possible 50-bit output string.

    For a classical computer, the task becomes exponentially harder as the number of bits in the string gets larger. But for a quantum computer, the task is expected to remain relatively straightforward, whether it involves five bits or 50.

    The quantum computer starts with all its quantum bits—qubits—in a certain state. Let’s say they all start at 0. Just as classical computers act on classical bits using so-called logic gates, quantum computers manipulate qubits using the quantum equivalent, known as quantum gates.

    But quantum gates can put qubits into strange states. For example, one kind of gate can put a qubit that starts with an initial value 0 into a superposition of 0 and 1. If you were to then measure the state of the qubit, it would collapse randomly into either 0 or 1 with equal probability.
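
    The gate described here is the Hadamard gate, and the collapse is easy to mimic numerically with a two-amplitude state vector. A minimal sketch (a real quantum computer does this physically, not by simulation):

```python
import math
import random

# One qubit as a vector of two amplitudes, starting in state |0>.
state = [1.0, 0.0]

# A Hadamard gate sends |0> to an equal superposition of |0> and |1>.
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[1]),
         h * (state[0] - state[1])]

# Measurement: outcome probabilities are the squared amplitudes (Born rule),
# and the superposition collapses randomly to a single basis state.
p0 = state[0] ** 2
outcome = 0 if random.random() < p0 else 1
state = [1.0, 0.0] if outcome == 0 else [0.0, 1.0]  # the collapse
print(outcome)  # 0 or 1, each with probability 1/2
```
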

    Even more bizarrely, quantum gates that act on two or more qubits at once can cause the qubits to become “entangled” with each other. In this case, the states of the qubits become intertwined, so that the qubits can now only be described using a single quantum state.

    If you put a bunch of quantum gates together, then have them act on a set of qubits in some specified sequence, you’ve created a quantum circuit. In our case, to randomly output a 50-bit string, you can build a quantum circuit that puts 50 qubits, taken together, into a superposition of states that captures the distribution you’d like to re-create.

    When the qubits are measured, the entire superposition will collapse randomly to one 50-bit string. The probability that it’ll collapse to any given string is dictated by the distribution that is specified by the quantum circuit. Measuring the qubits is akin to reaching blindfolded into the box and randomly sampling one string from the distribution.

    Scott Aaronson, a computer scientist at the University of Texas, Austin, says that random number generation will probably be “the first application of quantum computers that will be technologically feasible to implement.”
    Computer Science Department, University of Texas at Austin

    How does this get us to random numbers? Crucially, the 50-bit string sampled by the quantum computer will have a lot of entropy, a measure of disorder or unpredictability, and hence randomness. “This might actually be kind of a big deal,” said Scott Aaronson, a computer scientist at the University of Texas, Austin, who came up with the new protocol. “Not because it’s the most important application of quantum computers—I think it’s far from that—rather, because it looks like probably the first application of quantum computers that will be technologically feasible to implement.”

    Aaronson’s protocol to generate randomness is fairly straightforward. A classical computer first gathers a few bits of randomness from some trusted source and uses this “seed randomness” to generate the description of a quantum circuit. The random bits determine the types of quantum gates and the sequence in which they should act on the qubits. The classical computer sends the description to the quantum computer, which implements the quantum circuit, measures the qubits, and sends back the 50-bit output bit string. In doing so, it has randomly sampled from the distribution specified by the circuit.

    Now repeat the process over and over—for example, 10 times for each quantum circuit. The classical computer uses statistical tests to ensure that the output strings have a lot of entropy. Aaronson has shown, partly in work published with Lijie Chen and partly in work yet to be published, that under plausible assumptions about the computational hardness of such problems, no classical computer can generate such entropy in anywhere near the time it would take a quantum computer to randomly sample from a distribution. After the checks, the classical computer pastes together all the 50-bit output strings and feeds them to a well-known classical algorithm. “It produces a long string that is nearly perfectly random,” Aaronson said.
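
    The shape of that loop can be sketched classically. Everything quantum here is faked: `quantum_sample` is a placeholder for the hardware (a seeded PRNG, so its output is not actually random), the statistical entropy tests are omitted, and SHA-256 is only a crude stand-in for the randomness extractor the real protocol uses.

```python
import hashlib
import random

N_BITS = 50

def quantum_sample(circuit_seed: int) -> int:
    # Placeholder for the quantum computer: in the protocol, this step runs
    # the described circuit on hardware and measures the qubits. A seeded
    # PRNG only mimics the shape of the output, not its entropy.
    return random.Random(circuit_seed).getrandbits(N_BITS)

def aaronson_sketch(seed: int, n_circuits: int = 5, shots: int = 10) -> str:
    rng = random.Random(seed)          # the trusted "seed randomness"
    outputs = []
    for _ in range(n_circuits):
        circuit = rng.getrandbits(64)  # stands in for a random circuit description
        for shot in range(shots):
            outputs.append(quantum_sample(circuit + shot))
            # (here the classical computer would run statistical tests to
            #  certify that the outputs carry enough entropy)
    # Paste the raw strings together and condense them; hashing stands in
    # for the classical randomness-extraction algorithm.
    raw = b"".join(s.to_bytes((N_BITS + 7) // 8, "big") for s in outputs)
    return hashlib.sha256(raw).hexdigest()

print(aaronson_sketch(42))
```
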

    The Quantum Trapdoor

    Aaronson’s protocol is best suited for quantum computers with about 50 to 100 qubits. As the number of qubits in a quantum computer passes this threshold, it becomes computationally intractable for even classical supercomputers to use the protocol. This is where another scheme for generating verifiable randomness using quantum computers enters the picture. It uses an existing mathematical technique with a forbidding name: a trapdoor claw-free function. “It sounds much worse than it is,” said Umesh Vazirani, a computer scientist at the University of California, Berkeley, who devised the new strategy along with Zvika Brakerski, Paul Christiano, Urmila Mahadev and Thomas Vidick.

    Imagine a box again. Instead of reaching in and extracting a string, this time you drop in an n-bit string, call it x, and out pops another n-bit string. The box is somehow mapping an input string to an output string. But the box has a special property: For every x, there is another input string y that generates the same output string.

    In other words, there exist two unique input strings—x and y—for which the box returns the same output string, z. This triplet of x, y and z is called a claw. The box, in computer science speak, is a function. The function is easy to compute, meaning that given x or y, it’s easy to calculate z. But if you are only given x and z, finding y—and hence the claw—is impossible, even for a quantum computer.
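
    The claw-free functions in this protocol are built from lattice cryptography, but a classic toy analogue is Rabin squaring, f(x) = x² mod N with N = p·q, where the secret factorization plays the role of the trapdoor. A sketch with deliberately tiny numbers (at real sizes, finding a claw without the trapdoor is as hard as factoring N):

```python
from math import gcd

p, q = 7, 11       # the trapdoor: the secret factors of N
N = p * q          # everyone knows N = 77

def f(x: int) -> int:
    # Easy for anyone to compute, like the box in the text.
    return (x * x) % N

# Two essentially different inputs that collide - a "claw".
# (Found here by brute force, which only works because N is tiny.)
x, y = 4, 18
assert f(x) == f(y) == 16 and y != N - x

# Exhibiting such a claw is as powerful as holding the trapdoor:
# gcd(x + y, N) reveals a secret factor of N.
print(gcd(x + y, N))  # 11
```
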

    Urmila Mahadev, Umesh Vazirani and Thomas Vidick (from left) developed a random number generator by linking cryptography with quantum information processing.
    Jana Asenbrennerova/Quanta Magazine; Simons Institute for the Theory of Computing; Courtesy of Caltech

    The only way you could get at the claw is if you had some inside information, the so-called trapdoor.

    Vazirani and his colleagues want to use such functions not only to get quantum computers to generate randomness, but to verify that the quantum computer is behaving, well, quantum mechanically—which is essential to trusting the randomness.

    The protocol starts with a quantum computer that puts n qubits into a superposition of all n-bit strings. Then a classical computer sends over a description of a quantum circuit specifying the function to be applied to the superposition—a trapdoor claw-free function. The quantum computer implements the circuit, but without knowing anything about the trapdoor.

    At this stage, the quantum computer enters a state in which one set of its qubits is in a superposition of all n-bit strings, while another set holds the result of applying the function to this superposition. The two sets of qubits are entangled with each other.

    The quantum computer then measures the second set of qubits, randomly collapsing the superposition into some output z. The first set of qubits, however, collapses into an equal superposition of two n-bit strings, x and y, because either could have served as input to the function that led to z.

    The classical computer receives the output z, then does one of two things. Most of the time, it asks the quantum computer to measure its remaining qubits. This will collapse the superposition, with a 50-50 chance, into either x or y. That’s equivalent to getting a 0 or a 1, randomly.

    Occasionally, to check on the quantum computer’s quantumness, the classical computer asks for a special measurement. The measurement and its outcome are designed so that the classical computer, with the help of the trapdoor that only it has access to, can ensure that the device answering its queries is indeed quantum. Vazirani and colleagues have shown that if the device gives the correct answer to the special measurement without using collapsing qubits, that’s equivalent to figuring out the claw without using the trapdoor. This, of course, is impossible. So there must be at least one qubit collapsing inside the device (providing, randomly, a 0 or a 1). “[The protocol] is creating a tamper-proof qubit inside an untrusted quantum computer,” Vazirani said.

    This scheme might be faster than Aaronson’s quantum sampling protocol, but it has a distinct disadvantage. “It’s not going to be practical with 50 or 70 qubits,” Aaronson said.

    Aaronson, for now, is waiting for Google’s system. “Whether the thing they are going to roll out is going to be actually good enough to achieve quantum supremacy is a big question,” he said.

    If it is, then verifiable quantum randomness from a single quantum device is around the corner. “We think it’s useful and a potential market, and that’s something we want to think about offering to people,” Martinis said.


  • Younger generations are growing horns in the back of their head

    Younger generations seem to be developing horns in the back of their skulls due to the extended use of technology like smartphones and tablets.

    Two Australian researchers made the bizarre discovery while examining hundreds of X-rays of people aged between 18 and 30, finding almost half had developed bone growths.

    They’re the kind of spurs normally seen in hunched-over elderly people who’ve subjected their bodies to long-term poor posture and significant stress loads on their bones.

    But the presence of the “horn-like” skull growths raises serious concerns about what extended use of phones is doing to young people’s bodies.

    The findings by Dr David Shahar and Associate Professor Mark Sayers at The University of the Sunshine Coast flew under the radar when they were published at the end of last year, two years after their initial warning about the trend.

    But a BBC article last week about how tech is changing the human body cited their research and saw an explosion in interest in the work.

    These kinds of growths are typically found in older people and result from long-term stress on the skeleton and, until the advent of gadget technology, were rarely found in young people.

    “This is evidence that musculoskeletal degenerative processes can start and progress silently from an early age,” Dr Shahar said.

    “These findings were surprising because typically they take years to develop and are more likely to be seen in the ageing population.

    “It is important to understand that, in most cases, bone spurs measure a few single millimetres and yet we found projections of 10 to 30 millimetres in the studied young population.”

    One of the X-rays showing the horn. Source: Supplied

    This particular growth measured more than two centimetres in length. Source: Supplied

    The findings offer a warning about the impact of poor posture, especially in young people, due to extended phone and gadget use.

  • Freaky robot fish powered by ‘blood’ and goes days ‘without eating’

    Pikul, a researcher in Engineering and Applied Mechanics, said: “We realised that the operation time of most robots is very short before they have to recharge, on the order of tens of minutes, yet humans can operate for days without eating.

    “We wanted to solve this problem by finding ways to store energy in all the components of a robot.

    “This robot blood is our first demonstration of storing energy in a fluid that is normally only used for actuation.”

    Pikul added: “As the fluid is pumped through the fish robot, the moving fluid causes the robot to move. The vascular system, therefore, is multifunctional. It is these multiple functions that allow the robot to maintain its dexterity while also having a long operational time.”

    The scientist and his team came up with the idea for the robot while brainstorming new ways of making robots more independent.

    In tests, the robotic fish was able to work for eight times longer than traditional robots.

    It is able to swim for long durations against a current, operating at a rate of 1.5 body lengths per minute for a maximum of 36 hours.

    source

  • World’s first roller coaster at sea

    World’s first roller coaster at sea

    A new cruise ship for Carnival cruise line will feature the world’s first roller coaster at sea. The BOLT: Ultimate Sea Coaster will debut in 2020 when the Mardi Gras hits the high seas for the first time, the company announced. The roller coaster contains 800 feet of track reaching speeds of nearly 40 mph. It will give riders 360-degree views of the water 187 feet above the ocean, and riders can control how fast they go. The cruise line indicated that there will be an additional fee for guests to take a ride on the new coaster.

    Among the other features on the ship will be the Big Chicken restaurant created by Carnival Chief Fun Officer and NBA Hall of Famer Shaquille O’Neal. “Big Chicken is a labor of love featuring all of my favorite fried chicken recipes developed in tandem with my mom,” said O’Neal. “Carnival is a great partner, and I am very excited to bring the largest Big Chicken at sea aboard the spectacular new Mardi Gras.”

    The Carnival Mardi Gras will set sail for a series of voyages from New York, then head to Port Canaveral for cruises to the Caribbean beginning in October 2020.

    source


  • The Quantum Menace

    Over the last few decades, the word ‘quantum’ has become increasingly popular. It is common to find articles, reports, and many people interested in quantum mechanics and the new capabilities and improvements it brings to the scientific community. This topic does not concern only physics, since the development of quantum mechanics impacts several other fields such as chemistry, economics, artificial intelligence, operations research, and, undoubtedly, cryptography.

    This post begins a trio of blogs describing the impact of quantum computing on cryptography, and how to use stronger algorithms resistant to the power of quantum computing.

    • This post introduces quantum computing and describes the main aspects of this new computing model and its devastating impact on security standards; it summarizes some approaches to securing information using quantum-resistant algorithms.

    All of this is part of Cloudflare’s Crypto Week 2019. Now fasten your seatbelt and get ready to make a quantum leap.

    What is Quantum Computing?

    Back in 1981, Richard Feynman raised the question of what kind of computer could be used to simulate physics. Some physical phenomena, such as quantum mechanics, cannot be efficiently simulated using a classical computer, so he conjectured the existence of a computer model that behaves under the rules of quantum mechanics, opening a field of research now called quantum computing. To understand the basics of quantum computing, it is necessary to recall how classical computers work, and from there shine a spotlight on the differences between these computational models.

    Fellows of the Royal Society: John Maynard Smith, Richard Feynman & Alan Turing

    In 1936, Alan Turing and Emil Post independently described models that gave rise to the foundation of the computing model known as the Post-Turing machine, which describes how computers work and allowed further determination of limits for solving problems.

    In this model, the units of information are bits, which store one of two possible values, usually denoted by 0 and 1. A computing machine contains a set of bits and performs operations that modify the values of the bits, also known as the machine’s state. Thus, a machine with N bits can be in one of 2ᴺ possible states. With this in mind, the Post-Turing computing model can be abstractly described as a machine of states, in which running a program is translated as machine transitions along the set of states.
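    This state-machine view is easy to make concrete: a minimal Python sketch (the names are illustrative) in which a state is a tuple of N bits and a program step is any function from states to states.

```python
from itertools import product

def all_states(n):
    """Enumerate the 2**n possible states of an n-bit machine."""
    return list(product([0, 1], repeat=n))

def flip_first(state):
    """One possible transition: flip the first bit, keep the rest."""
    return (1 - state[0],) + state[1:]

states = all_states(3)
assert len(states) == 2 ** 3                 # a 3-bit machine has 8 states
assert flip_first((0, 1, 1)) == (1, 1, 1)    # a program step moves between states
```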

    A paper David Deutsch published in 1985 describes a computing model that extends the capabilities of a Turing machine based on the theory of quantum mechanics. This computing model introduces several advantages over the Turing model for processing large volumes of information. It also presents unique properties that deviate from the way we understand classical computing. Most of these properties come from the nature of quantum mechanics. We’re going to dive into these details before approaching the concept of quantum computing.

    Superposition

    One of the most exciting properties of quantum computing that provides an advantage over the classical computing model is superposition. In physics, superposition is the ability to produce valid states from the addition or superposition of several other states that are part of a system.

    Applied to computing, this means there is a system in which it is possible to generate a machine state that represents a (weighted) sum of the states 0 and 1; here, the term weighted means that the state keeps track of “the quantity of” 0 and 1 present in the state. In the classical computation model, one bit can only store either the state 0 or 1, not both; even two bits cannot represent a weighted sum of these states. Hence, to distinguish it from the basic states, quantum computing uses the concept of a quantum bit (qubit), a unit of information denoting the superposition of two states. This is a cornerstone concept of quantum computing as it provides a way of tracking more than a single state per unit of information, making it a powerful tool for processing information.

    Classical computing – A bit stores only one of two possible states: ON or OFF.

    Quantum computing – A qubit stores a combination of two or more states.

    So, a qubit represents the sum of two parts: the 0 or 1 state plus the amount each 0/1 state contributes to produce the state of the qubit.

    In Dirac notation, a qubit is written as \( | \Psi \rangle =  A | 0 \rangle +B | 1 \rangle \), an explicit sum indicating that it represents the superposition of the states 0 and 1; here, A and B are complex numbers known as the amplitudes of the states 0 and 1, respectively. The basic states are themselves represented as \( | 0 \rangle =  1 | 0 \rangle + 0 | 1 \rangle \) and \( | 1 \rangle =  0 | 0 \rangle + 1 | 1 \rangle \), where the left-hand side of each equation is the abbreviated notation for these special states.

    Measurement

    In a classical computer, the values 0 and 1 are implemented as digital signals. Measuring the current of the signal automatically reveals the status of a bit. This means that at any moment the value of the bit can be observed or measured.

    The state of a qubit is maintained in a physically closed system, meaning that properties of the system such as superposition require no interaction with the environment; any interaction, like performing a measurement, can cause interference with the state of the qubit.

    Measuring a qubit is a probabilistic experiment. The result is a bit of information that depends on the state of the qubit. The bit, obtained by measuring \( | \Psi \rangle =  A | 0 \rangle +B | 1 \rangle \), will be equal to 0 with probability \( |A|^2 \),  and equal to 1 with probability \( |B|^2 \), where \( |x| \) represents the absolute value of \(x\).

    From statistics, we know that the probabilities of all possible events always sum to 1, so it must hold that \( |A|^2 +|B|^2 =1 \). This last equation motivates representing qubits as points on a circle of radius one and, more generally, as points on the surface of a sphere of radius one, known as the Bloch sphere.
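    These formulas are easy to check numerically. Below is a small illustrative sketch (plain Python, not a real quantum simulator) that stores a qubit as its amplitude pair \((A, B)\) and samples measurements:

```python
import random

def measure(A, B):
    """Collapse A|0> + B|1> to a classical bit: 0 w.p. |A|^2, 1 w.p. |B|^2."""
    assert abs(abs(A) ** 2 + abs(B) ** 2 - 1) < 1e-9  # amplitudes are normalized
    return 0 if random.random() < abs(A) ** 2 else 1

# The qubit (1/sqrt(2))|0> + (1/sqrt(2))|1>: squaring either amplitude gives 1/2,
# so each outcome occurs about half of the time.
A = B = 1 / 2 ** 0.5
samples = [measure(A, B) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```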

    The qubit state is analogous to a point on a unitary circle.

    The Bloch Sphere by Smite-Meister – Own work, CC BY-SA 3.0.

    Let’s break it down: if you measure a qubit, you also destroy its superposition; the state collapses to one of the basic states, and that collapsed state is your final result.

    Another way to think about superposition and measurement is through the coin tossing experiment.

    Toss a coin in the air and you give people a random choice between two options: heads or tails. Don’t focus on the randomness of the experiment; instead, note that while the coin is rotating in the air, participants are uncertain which side will face up when the coin lands. Once the coin stops with a random side facing up, participants are 100% certain of the status.

    How does it relate? Qubits are similar to the participants. When a qubit is in a superposition of states, it is tracking the probabilities of heads and tails, mirroring the participants’ uncertainty while the coin is in the air. However, once you measure the qubit to retrieve its value, the superposition vanishes and a classical bit value sticks: heads or tails. Measurement is the moment when the coin is static with only one side facing up.

    A fair coin is a coin that is not biased. Each side (assume 0=heads and 1=tails) of a fair coin has the same probability of sticking after a measurement is performed. The qubit \( \tfrac{1}{\sqrt{2}}|0\rangle + \tfrac{1}{\sqrt{2}}|1\rangle \) describes the probabilities of tossing a fair coin. Note that squaring either of the amplitudes results in ½, indicating that there is a 50% chance either heads or tails sticks.

    It would be interesting to be able to bias a fair coin at will while it is in the air. Although this sounds like the magic of a professional illusionist, the task can, in fact, be achieved by performing operations over qubits. So, get ready to become the next quantum magician!

    Quantum Gates

    A logic gate represents a Boolean function operating over a set of inputs (on the left) and producing an output (on the right). A logic circuit is a set of connected logic gates, a convenient way to represent bit operations.

    The NOT gate is a single-bit operation that flips the value of the input bit.

    Other gates include AND, OR, XOR, and NAND. A set of gates is universal if any other gate can be generated from it; for example, the NOR and NAND gates are each universal, since any circuit can be constructed using only one of them.

    Quantum computing also admits a description using circuits. Quantum gates operate over qubits, modifying the superposition of the states. For example, there is a quantum gate analogous to the NOT gate, the X gate.

    The X quantum gate interchanges the amplitudes of the states of the input qubit.

    The Z quantum gate flips the sign of the amplitude of state 1.

    Another quantum gate is the Hadamard gate, which generates an equiprobable superposition of the basic states.

    Using our coin tossing analogy, the Hadamard gate has the action of tossing a fair coin to the air. In quantum circuits, a triangle represents measuring a qubit, and the resulting bit is indicated by a double-wire.

    Other gates, such as the CNOT gate, the Pauli gates, the Toffoli gate, and the Deutsch gate, are slightly more advanced. Quirk, an open-source playground, is a fun sandbox where you can construct quantum circuits using all of these gates.
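    The action of the X, Z, and Hadamard gates can be sketched with ordinary linear algebra: each single-qubit gate is a 2×2 matrix applied to the pair of amplitudes (plain Python, illustrative; not a real simulator):

```python
import math

def apply(gate, qubit):
    """Multiply a 2x2 gate matrix by a 2-entry amplitude vector (A, B)."""
    (a, b), (c, d) = gate
    A, B = qubit
    return (a * A + b * B, c * A + d * B)

X = ((0, 1), (1, 0))      # interchanges the amplitudes of |0> and |1>
Z = ((1, 0), (0, -1))     # flips the sign of the |1> amplitude
h = 1 / math.sqrt(2)
H = ((h, h), (h, -h))     # equiprobable superposition of the basic states

zero = (1, 0)                              # the basis state |0>
assert apply(X, zero) == (0, 1)            # X|0> = |1>
assert apply(Z, (0, 1)) == (0, -1)         # Z|1> = -|1>
assert apply(X, apply(X, (0.6, 0.8))) == (0.6, 0.8)  # X is its own inverse
plus = apply(H, zero)                      # H|0> has amplitudes (h, h)
assert abs(plus[0] ** 2 + plus[1] ** 2 - 1) < 1e-9   # still normalized
```

    The last assertion also previews the next section: applying X twice recovers the input, which is exactly what reversibility means.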

    Reversibility

    An operation is reversible if there exists another operation that rolls back the output state to the initial state. For instance, a NOT gate is reversible since applying a second NOT gate recovers the initial input.

    In contrast, the AND, OR, and NAND gates are not reversible: some classical computations cannot be reversed by a circuit that uses only the output bits. However, if additional bits of information are carried along, the operation can be reversed.

    Quantum computing mainly focuses on reversible computations, because there’s always a way to construct a reversible circuit to perform an irreversible computation. The reversible version of a circuit could require the use of ancillary qubits as auxiliary (but not temporary) variables.

    Due to the nature of composed systems, these ancillas (extra qubits) may become correlated with qubits of the main computation. This correlation makes it infeasible to reuse ancillas, since any modification could have side effects on the operation of the reversible circuit. This is like memory assigned to a process by the operating system: a process cannot use memory belonging to other processes without risking corruption, and it cannot release its assigned memory to other processes. You could use garbage-collection mechanisms for ancillas, but performing reversible computations increases your qubit budget.

    Composed Systems

    In quantum mechanics, a single qubit can be described as a single closed system: a system that has no interaction with the environment or with other qubits. Letting qubits interact with others leads to a composed system where more states are represented. The state of a 2-qubit composite system is denoted as \(A_0|00\rangle+A_1|01\rangle+A_2|10\rangle+A_3|11\rangle \), where the \( A_i \) values correspond to the amplitudes of the four basic states 00, 01, 10, and 11. The qubit \( \tfrac{1}{2}|00\rangle+\tfrac{1}{2}|01\rangle+\tfrac{1}{2}|10\rangle+\tfrac{1}{2}|11\rangle \) represents the superposition of these basic states, each of which is obtained with the same probability when the two qubits are measured.

    In the classical case, the state of N bits represents only one of 2ᴺ possible states, whereas a composed state of N qubits represents all the 2ᴺ states but in superposition. This is one big difference between these computing models as it carries two important properties: entanglement and quantum parallelism.
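    When the qubits are independent, the state of the composed system is the tensor (Kronecker) product of the individual amplitude vectors; a small illustrative sketch:

```python
def kron(u, v):
    """Tensor product of two amplitude vectors: the composed system's state."""
    return [a * b for a in u for b in v]

h = 1 / 2 ** 0.5
plus = [h, h]                     # (|0> + |1>)/sqrt(2)
state = kron(plus, plus)          # amplitudes of |00>, |01>, |10>, |11>
print(state)                      # four amplitudes, each ~0.5

# N qubits in uniform superposition track all 2**N amplitudes at once.
n, reg = 4, [1.0]
for _ in range(n):
    reg = kron(reg, plus)
assert len(reg) == 2 ** n
```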

    Entanglement

    According to the theory behind quantum mechanics, some composed states can be described through the description of their constituents. However, there are composed states for which no such description is possible; these are known as entangled states.

    Bell states are examples of entangled qubits.

    The entanglement phenomenon was pointed out by Einstein, Podolsky, and Rosen in the so-called EPR paradox. Suppose there is a composed system of two entangled qubits, in which performing a measurement on one qubit interferes with the measurement of the second. This interference occurs even when the qubits are separated by a long distance, which seems to imply that some information transfer happens faster than the speed of light. This is how quantum entanglement conflicts with the theory of relativity, in which information cannot travel faster than the speed of light. The EPR paradox motivated further investigation into new interpretations of quantum mechanics aimed at resolving the paradox.
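    The measurement correlation of a Bell state is easy to simulate classically; an illustrative sketch for \( \tfrac{1}{\sqrt{2}}(|00\rangle + |11\rangle) \):

```python
import random

# Amplitudes of |00>, |01>, |10>, |11> for the Bell state (|00> + |11>)/sqrt(2).
h = 1 / 2 ** 0.5
bell = [h, 0, 0, h]

def measure_pair(state):
    """Sample a 2-bit outcome with probability |amplitude|^2."""
    r, acc = random.random(), 0.0
    for i, amp in enumerate(state):
        acc += abs(amp) ** 2
        if r < acc:
            return i >> 1, i & 1   # (first qubit, second qubit)
    return 1, 1

# The two bits always agree: measuring one qubit fixes the other.
assert all(a == b for a, b in (measure_pair(bell) for _ in range(1000)))
```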

    Quantum entanglement can help to transfer information at a distance by following a communication protocol. The following protocol examples rely on the fact that Alice and Bob separately possess one of two entangled qubits:

    • The superdense coding protocol allows Alice to communicate a 2-bit message \(m_0,m_1\) to Bob using a quantum communication channel, for example, using fiber optics to transmit photons. All Alice has to do is operate on her qubit according to the value of the message and send the resulting qubit to Bob. Once Bob receives the qubit, he measures both qubits, noting that the collapsed 2-bit state corresponds to Alice’s message.

    Superdense coding protocol.

    • The quantum teleportation protocol allows Alice to transmit a qubit to Bob without using a quantum communication channel. Alice measures the qubit to send Bob and her entangled qubit resulting in two bits. Alice sends these bits to Bob, who operates on his entangled qubit according to the bits received and notes that the result state matches the original state of Alice’s qubit.

    Quantum teleportation protocol.

    Quantum Parallelism

    Composed systems of qubits allow more information to be represented per composed state. Note that operating on a composed state of N qubits is equivalent to operating over a set of 2ᴺ states in superposition. This procedure is called quantum parallelism. In this setting, operating over a large volume of information gives the intuition of performing operations in parallel, as in the parallel computing paradigm; one big caveat is that superposition is not equivalent to parallelism.

    Remember that a composed state is a superposition of several states, so a computation that takes a composed state as input will produce a composed state of outputs. The main divergence between classical and quantum parallelism is that quantum parallelism can obtain only one of the processed outputs: a measurement on the output of a composed state causes the qubits to collapse to only one of the outputs, making it unattainable to read off all the computed values.

    Although quantum parallelism does not match precisely with the traditional notion of parallel computing, you can still leverage this computational power to get related information.

    Deutsch-Jozsa Problem: Assume \(F\) is a function that takes as input N bits, outputs one bit, and is either constant (always outputs the same value for all inputs) or balanced (outputs 0 for half of the inputs and 1 for the other half). The problem is to determine if \(F\) is constant or balanced.

    The quantum algorithm that solves the Deutsch-Jozsa problem uses quantum parallelism. First, N qubits are initialized in a superposition of 2ᴺ states. Then, in a single shot, it evaluates \(F\) for all of these states.

    (Note that some factors were omitted for simplicity.)

    The result of applying \(F\) appears in the exponent of the amplitude of the all-zeros state. Only when \(F\) is constant is this amplitude either +1 or -1. If the result of measuring the N qubits is the all-zeros bitstring, then there is 100% certainty that \(F\) is constant; any other result indicates that \(F\) is balanced.

    A deterministic classical algorithm solves this problem using \( 2^{N-1}+1\) evaluations of \(F\) in the worst case. Meanwhile, the quantum algorithm requires only one evaluation. The Deutsch-Jozsa problem exemplifies the exponential advantage of a quantum algorithm over classical algorithms.
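    Because the final amplitude of the all-zeros state equals \( \tfrac{1}{2^N}\sum_x (-1)^{F(x)} \), the algorithm’s verdict can be sketched without simulating every gate (illustrative Python; a real implementation would apply Hadamard and oracle gates to a full state vector):

```python
def deutsch_jozsa(f, n):
    """Classify a promise function f on n bits as 'constant' or 'balanced'.

    After Hadamards, the phase oracle (-1)^f(x), and Hadamards again, the
    amplitude of the all-zeros state is (1/2**n) * sum_x (-1)**f(x):
    it is +1 or -1 iff f is constant, and 0 iff f is balanced.
    """
    amp_zero = sum((-1) ** f(x) for x in range(2 ** n)) / 2 ** n
    return "constant" if abs(amp_zero) > 0.5 else "balanced"

assert deutsch_jozsa(lambda x: 1, 4) == "constant"
assert deutsch_jozsa(lambda x: x & 1, 4) == "balanced"   # half 0s, half 1s
```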

    Quantum Computers

    The theory of quantum computing is supported by investigations in the field of quantum mechanics. However, constructing a quantum machine requires a physical system that allows representing qubits and manipulating states in a reliable and precise way.

    The DiVincenzo Criteria require that a physical implementation of a quantum computer must:

    1. Be scalable and have well-defined qubits.
    2. Be able to initialize qubits to a state.
    3. Have long decoherence times to apply quantum error-correcting codes. Decoherence of a qubit happens when the qubit interacts with the environment, for example, when a measurement is performed.
    4. Use a universal set of quantum gates.
    5. Be able to measure single qubits without modifying others.

    Physical implementations of quantum computers face huge engineering obstacles to satisfy these requirements. The most important challenge is to guarantee low error rates during computation and measurement. Lowering these rates requires techniques for error correction, which add a significant number of qubits specialized in this task. For this reason, the qubit count of a quantum computer should not be read the way the bit count of a classical system is. In a classical computer, all of the bits are effective for performing a calculation, whereas the number of qubits is the sum of the effective qubits (those used to make calculations), the ancillas (used for reversible computations), and the error-correction qubits.

    Current implementations of quantum computers satisfy the DiVincenzo criteria only partially. Quantum adiabatic computers are one example: they do not operate using quantum gates, and for this reason they are not considered universal quantum computers.

    Quantum Adiabatic Computers

    A recurrent problem in optimization is to find the global minimum of an objective function. For example, a route-traffic control system can be modeled as a function whose minimum corresponds to the routing of lowest cost. Simulated annealing is a heuristic procedure that provides a good solution to these types of problems. Simulated annealing finds the solution state by slowly introducing changes (the adiabatic process) to the variables that govern the system.
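    A minimal simulated-annealing sketch in Python (the objective function and the cooling schedule are made up for illustration): uphill moves are accepted with probability \(e^{-\Delta E/t}\), which lets the search escape local minima while the temperature is high.

```python
import math
import random

def anneal(f, x, steps=20_000, temp=5.0):
    """Minimize f by random local moves while slowly lowering the temperature."""
    best = x
    for i in range(steps):
        t = temp * (1 - i / steps) + 1e-9        # slow cooling schedule
        cand = x + random.uniform(-1.0, 1.0)     # a small random change
        dE = f(cand) - f(x)
        # Always accept downhill moves; accept uphill moves with prob e^(-dE/t).
        if dE < 0 or random.random() < math.exp(-dE / t):
            x = cand
        if f(x) < f(best):
            best = x
    return best

# A bumpy objective whose global minimum lies near x = 2.
f = lambda x: (x - 2) ** 2 + math.sin(5 * x)
result = anneal(f, -10.0)
print(result)   # typically lands near the global minimum
```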

    Quantum annealing is the quantum analogue of simulated annealing. A set of qubits is initialized into a superposition of states representing all possible solutions to the problem. The objective function is encoded using the Hamiltonian operator, the sum of the potential and kinetic energies of the system, which describes the evolution of the system over time. Then, if the system is allowed to evolve very slowly, it will eventually land on a final state representing the optimal value of the objective function.

    Currently, there exist adiabatic computers on the market, such as the D-Wave systems, featuring hundreds of qubits; however, their capabilities are limited to problems that can be modeled as optimization problems. The limits of adiabatic computers were studied by van Dam et al., showing that although they can solve local search problems and even some instances of the max-SAT problem, there exist harder search problems this computing model cannot efficiently solve.

    Nuclear Magnetic Resonance

    Nuclear Magnetic Resonance (NMR) is a physical phenomenon that can be used to represent qubits: the spins of the atomic nuclei of molecules are perturbed by an oscillating magnetic field. A 2001 report describes a successful implementation of Shor’s algorithm on a 7-qubit NMR quantum computer, an iconic result since this computer was able to factor the number 15.

    Nucleus spinning induced by a magnetic field, by Darekk2, CC BY-SA 3.0.

    NMR Spectrometer by UCSB.

    Superconducting Quantum Computers

    One way to physically construct qubits is based on superconductors, materials that conduct electric current with zero resistance when exposed to temperatures close to absolute zero.

    The Josephson effect, in which current flows across the junction of two superconductors separated by a non-superconducting material, is used to physically implement a superposition of states.

    A Josephson junction – Public Domain.

    When a magnetic flux is applied to this junction, the current flows continuously in one direction. But, depending on the quantity of magnetic flux applied, the current can also flow in the opposite direction. There exists a quantum superposition of currents flowing both clockwise and counterclockwise, leading to a physical implementation of a qubit called the flux qubit. The complete device is known as a Superconducting Quantum Interference Device (SQUID) and can easily be coupled to others, scaling up the number of qubits. Thus, SQUIDs are like the transistors of a quantum computer.

    SQUID: Superconducting Quantum Interference Device. Image by Kurzweil Network and original source.

    Examples of superconducting computers are:

    • The D-Wave systems (pictured: the D-Wave cooling system, by D-Wave Systems Inc.)

    • The IBM Q System (pictured: the IBM Q System One cryostat at CES)

    The Imminent Threat of Quantum Algorithms

    The quantum zoo website tracks problems that can be solved using quantum algorithms. As of mid-2018, more than 60 problems appear on this list, targeting diverse applications in number theory, approximation, simulation, and searching. As terrific as it sounds, some of the problems that quantum computers solve easily sit at the heart of information security.

    Grover’s Algorithm

    Tales of a quantum detective (fragment). A couple of detectives have the mission of finding the one culprit in a group of suspects who always answer this question honestly: “Are you guilty?”
    Detective C follows a classical interrogation method and interviews every person one at a time, until finding the first one who confesses.
    Detective Q proceeds in a different way: first, he gathers all suspects in a completely dark room and asks them, “Are you guilty?” A steady sound comes from the room saying “No!” while, at the same time, a single voice mixed into the air responds “Yes!” Since everybody is submerged in darkness, the detective cannot see the culprit. However, detective Q knows that as the interrogation advances, the culprit will grow desperate and start to speak louder and louder, and so he continues asking the same question. Suddenly, detective Q turns on the lights, enters the room, and captures the culprit. How did he do it?

    The task of the detectives can be modeled as a search problem: given a Boolean function \( f\) that takes N bits and produces one bit, find the unique input \(x\) such that \( f(x)=1\).

    A classical algorithm (detective C) finds \(x\) using \(2^N-1\) function evaluations in the worst case. However, the quantum algorithm devised by Grover, corresponding to detective Q, searches quadratically faster using around \(2^{N/2}\) function evaluations.

    The key intuition of Grover’s algorithm is increasing the amplitude of the state that represents the solution while maintaining the other states in a lower amplitude. In this way, a system of N qubits, which is a superposition of 2ᴺ possible inputs, can be continuously updated using this intuition until the solution state has an amplitude closer to 1. Hence, after updating the qubits many times, there will be a high probability to measure the solution state.

    Initially, a superposition of 2ᴺ states (horizontal axis) is set, each state has an amplitude (vertical axis) close to 0. The qubits are updated so that the amplitude of the solution state increases more than the amplitude of other states. By repeating the update step, the amplitude of the solution state gets closer to 1, which boosts the probability of collapsing to the solution state after measuring.

    Image taken from D. Bernstein’s .

    Grover’s Algorithm (pseudo-code):

    1. Prepare a register \(|x\rangle \) of N qubits as a uniform superposition of 2ᴺ states.
    2. Update the qubits by performing this core operation. $$ |x\rangle \mapsto (-1)^{f(x)} |x\rangle $$ The result of \( f(x) \) only flips the amplitude of the searched state.
    3. Invert each amplitude about the average of all the amplitudes.
    4. Repeat Steps 2 and 3 \( (\tfrac{\pi}{4})  2^{ N/2} \) times.
    5. Measure the qubits and return the bits obtained.

    Alternatively, the second step can be better understood as a conditional statement:

    IF f(x) = 1 THEN
         Negate the amplitude of the solution state.
    ELSE
         /* nothing */
    ENDIF
    
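    The steps above can be run as a classical simulation over the full list of 2ᴺ amplitudes (illustrative Python; exponential memory, so only for small N):

```python
import math

def grover(f, n):
    """Search for the unique x in [0, 2**n) with f(x) == 1."""
    size = 2 ** n
    amps = [1 / math.sqrt(size)] * size                  # Step 1: uniform superposition
    for _ in range(int(math.pi / 4 * math.sqrt(size))):  # Step 4: ~(pi/4)*2^(n/2) rounds
        # Step 2: flip the amplitude of the solution state.
        amps = [-a if f(x) else a for x, a in enumerate(amps)]
        # Step 3: invert every amplitude about the average.
        avg = sum(amps) / size
        amps = [2 * avg - a for a in amps]
    # Step 5: "measure" by reporting the most probable state.
    return max(range(size), key=lambda x: amps[x] ** 2)

assert grover(lambda x: x == 5, n=8) == 5   # 12 rounds instead of up to 255 evaluations
```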

    Grover’s algorithm treats the function \(f\) as a black box, so with slight modifications it can also be used to find collisions of the function. This implies that Grover’s algorithm can find a collision using asymptotically fewer operations than a brute-force algorithm.

    The power of Grover’s algorithm can be turned against cryptographic hash functions. For instance, a quantum computer running Grover’s algorithm could find a preimage of a SHA-256 digest performing only 2¹²⁸ evaluations of a reversible circuit of SHA-256. The natural protection for hash functions is to double the output size. More generally, most symmetric-key encryption algorithms will survive the power of Grover’s algorithm if their key sizes are doubled.

    The scenario for public-key algorithms is devastating in the face of Peter Shor’s algorithm.

    Shor’s Algorithm

    Multiplying integers is an easy task to accomplish; however, finding the factors that compose an integer is difficult. The integer factorization problem is to decompose a given integer into its prime factors. For example, 42 has three prime factors, 2, 3, and 7, since \( 2\times 3\times 7 = 42\). As the numbers get bigger, integer factorization becomes more difficult to solve, and the hardest instances are those in which the factors are two distinct large primes. Thus, given an integer \(N\), finding primes \(p\) and \(q\) such that \( N = p \times q\) is known as integer splitting.

    Factoring integers is like cutting wood, and the specific task of splitting integers is analogous to using an axe for splitting the log in two parts. There exist many different tools (algorithms) for accomplishing each task.

    For integer factorization, trial division, Pollard’s rho method, and the elliptic curve method are common algorithms. Fermat’s method and the quadratic and rational sieves lead to the (general) number field sieve (NFS) algorithm for integer splitting. The latter relies on finding a congruence of squares, that is, writing \(N\) as a difference of squares such that $$ N = x^2 - y^2 = (x+y)\times(x-y) $$ The complexity of NFS is mainly tied to the number of pairs \((x, y)\) that must be examined before finding a pair that factors \(N\). The NFS algorithm has subexponential complexity in the size of \(N\), meaning that the time required to split an integer increases significantly as the size of \(N\) grows. For large integers, the problem becomes intractable for classical computers.
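    The congruence-of-squares idea is visible in Fermat’s method itself: search for \(x\) such that \(x^2 - N\) is a perfect square \(y^2\), giving \(N = (x+y)(x-y)\). A toy sketch (Python, only practical for small odd \(N\) whose factors are close together):

```python
import math

def fermat_split(n):
    """Split odd n by finding x with x*x - n = y*y, so n = (x + y) * (x - y)."""
    x = math.isqrt(n)
    if x * x < n:
        x += 1                    # start at ceil(sqrt(n))
    while True:
        y2 = x * x - n
        y = math.isqrt(y2)
        if y * y == y2:           # x^2 - n is a perfect square: done
            return x + y, x - y
        x += 1

p, q = fermat_split(3007 * 4001)
assert (p, q) == (4001, 3007)     # 3007 * 4001 = (3504 + 497) * (3504 - 497)
```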

    The Axe of Thor Shor

    Olaf Tryggvason – Public Domain.

    The many guesses of the NFS algorithm are analogous to hitting the log with a dulled axe: after subexponentially many tries, the log is cut in half. Using a sharper axe, however, lets you split the log faster. This sharpened axe is the quantum algorithm proposed by Shor in 1994.

    Let \(x\) be an integer less than \(N\) whose order is \(k\), that is, \(k\) is the smallest positive integer such that \( x^k \equiv 1 \pmod N \). Then, if \(k\) is even, there exists an integer \(q\) so that \(qN\) can be factored as follows: $$ qN = x^k - 1 = (x^{k/2}-1)\times(x^{k/2}+1) $$

    This approach has some issues: for example, the factorization could correspond to \(q\), not \(N\), and the order \(k\) of \(x\) is unknown. Here is where Shor’s algorithm enters the picture, finding the order of \(x\).

    The internals of Shor’s algorithm rely on encoding the order \(k\) into a periodic function, so that its period can be obtained using the quantum version of the Fourier transform (QFT). The order of \(x\) can be found using a polynomial number of quantum evaluations. Therefore, splitting integers using this quantum approach has polynomial complexity in the size of \(N\).
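    Everything around the quantum core is classical number theory. The sketch below (Python, illustrative) finds the order \(k\) by brute force, which is precisely the step Shor’s algorithm performs in polynomial time, and then extracts the factors via \(\gcd(x^{k/2} \pm 1, N)\):

```python
from math import gcd

def split_via_order(N, x):
    """Try to split N using the order k of x modulo N."""
    g = gcd(x, N)
    if g != 1:
        return g, N // g                       # lucky guess: x shares a factor with N
    k, y = 1, x % N
    while y != 1:                              # brute-force order finding: the only
        k, y = k + 1, (y * x) % N              # step Shor's algorithm speeds up
    if k % 2 == 1:
        return None                            # odd order: retry with another x
    p = gcd(pow(x, k // 2, N) - 1, N)
    if p in (1, N):
        return None                            # trivial split: retry with another x
    return p, N // p

assert split_via_order(15, 7) == (3, 5)        # order of 7 mod 15 is 4
assert split_via_order(21, 2) == (7, 3)        # order of 2 mod 21 is 6
```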

    Shor's algorithm carries strong implications for the security of the RSA encryption scheme, because RSA's security relies on the hardness of integer factorization. A large-enough quantum computer can efficiently break RSA at the key sizes in current use.

    Alternatively, one may turn to elliptic curves, used in cryptographic protocols like ECDSA or ECDH. Moreover, TLS ciphersuites rely on a combination of elliptic curve groups, large prime-order groups, and RSA and DSA signatures. Unfortunately, all of these succumb to Shor's algorithm: it takes only a few modifications for it to solve the discrete logarithm problem in finite groups. This sounds like a catastrophic story in which all of our encrypted data and privacy are no longer secure once a quantum computer arrives, and in some sense this is true.
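    For intuition, here is the discrete logarithm problem in a toy multiplicative group (the numbers are illustrative; real cryptographic groups are astronomically larger). Classically, exhaustive search is exponential in the bit length of the group order, while Shor's modified algorithm solves the same problem in polynomial time.

```python
def dlog_bruteforce(g, h, p):
    """Solve g^a = h (mod p) by exhaustive search -- exponential classically,
    but polynomial-time for Shor's algorithm on a quantum computer."""
    v = 1
    for a in range(p):
        if v == h:
            return a
        v = (v * g) % p
    return None  # h is not in the subgroup generated by g

print(dlog_bruteforce(5, 8, 23))  # -> 6, since 5^6 = 15625 = 8 (mod 23)
```

    The same search-versus-period-finding gap applies to elliptic curve groups, which is why ECDSA and ECDH offer no extra protection against a quantum adversary.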

    On the other hand, the quantum computers built as of 2019 are not large enough to run, for instance, Shor's algorithm at the RSA key sizes used in standard protocols. For example, a 2018 report describes experiments on factoring a 19-bit number using 94 qubits, and estimates that 147,456 qubits would be needed to factor a 768-bit number. These numbers indicate that we are still far from breaking RSA.

    What if we increase RSA key sizes to resist quantum algorithms, just as we do for symmetric algorithms?

    Bernstein et al. estimated that RSA public keys would need to be as large as 1 terabyte to keep RSA secure in the presence of quantum factoring algorithms. So, for public-key algorithms, increasing the key size does not help.

    A recent investigation by Gidney and Ekerå shows improvements that accelerate quantum factorization. In their report, the cost of factoring 2048-bit integers is estimated at a few hours on a quantum machine of 20 million qubits, which is far beyond any current development. Notably, this qubit count is two orders of magnitude smaller than estimates from earlier work this decade. Under these estimates, current encryption algorithms will remain secure for several more years; however, consider the following not-so-unrealistic situation.

    Information encrypted today with, for example, RSA can be easily decrypted by a quantum computer in the future. Now, suppose that someone records encrypted traffic and stores it until a quantum computer is able to decrypt the ciphertexts. Although this could be as far as 20 years from now, the forward-secrecy principle is violated. A 20-year gap into the future is sometimes difficult to imagine, so let's think backwards: what would happen if everything you did on the Internet at the end of the 1990s could be revealed 20 years later, today? How would that impact the security of your personal information? What if the ciphertexts were company secrets or business deals? In 1999, most of us were concerned about the effects of the Y2K problem; now we face Y2Q (years to quantum): the advent of quantum computers.

    Post-Quantum Cryptography

    Although the current physical implementations of quantum computers are far from a real threat to secure communications, the transition to harder underlying problems for protecting information has already started. This wave emerged as post-quantum cryptography (PQC). The core idea of PQC is to find problems hard enough that no quantum (or classical) algorithm can solve them efficiently.

    A recurrent question is: what does a problem that even a quantum computer cannot solve look like?

    These so-called quantum-resistant algorithms rely on different hard mathematical assumptions, some as old as RSA, others more recently proposed. For example, the McEliece cryptosystem, formulated in the late 1970s, relies on the hardness of decoding a general linear code (in the sense of coding theory). This cryptosystem never saw widespread practical use, since other cryptosystems superseded it in efficiency over time. Fortunately, the McEliece cryptosystem remains immune to Shor's algorithm, which gives it renewed relevance in the post-quantum era.
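    The hard problem behind McEliece, syndrome decoding, can be sketched at toy scale: given a parity-check matrix H and a syndrome s, find a low-weight error vector e with He = s over GF(2). The matrix below is a made-up miniature, nothing like real McEliece parameters; the point is that brute force over error patterns grows exponentially with the code length.

```python
import itertools

# Hypothetical tiny parity-check matrix over GF(2), for illustration only.
H = [[1, 0, 1, 1, 0],
     [0, 1, 1, 0, 1]]

def syndrome(H, e):
    """Compute He over GF(2) for an error vector e."""
    return tuple(sum(h * x for h, x in zip(row, e)) % 2 for row in H)

def decode(H, s, max_weight=2):
    """Brute-force syndrome decoding: search error patterns by weight."""
    n = len(H[0])
    for w in range(max_weight + 1):
        for pos in itertools.combinations(range(n), w):
            e = [1 if i in pos else 0 for i in range(n)]
            if syndrome(H, e) == s:
                return e
    return None

print(decode(H, (1, 1)))  # -> [0, 0, 1, 0, 0]
```

    At real parameters there is no known shortcut, classical or quantum, for a random-looking code; the McEliece trapdoor is that the legitimate key holder knows a structured code for which decoding is easy.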

    Post-quantum cryptography presents alternatives:

    [Image: families of post-quantum algorithms, including lattice-based, code-based, hash-based, and multivariate cryptography]

    In 2017, NIST started an evaluation process to track possible alternatives for next-generation secure algorithms. From a practical perspective, the candidates present different trade-offs in implementation and usage; their time and space requirements are diverse, and at this point it is too early to tell which will succeed RSA and elliptic curves. The initial round collected 70 algorithms for key encapsulation mechanisms and digital signatures. As of early 2019, 28 of these survive and are currently in the analysis, investigation, and experimentation phase.

    Cloudflare's mission is to help build a better Internet. As a proactive measure, our cryptography team is preparing experiments on deploying post-quantum algorithms at Cloudflare scale. Watch our blog for more details.
