Physicists are not known for finesse. “Even if it cost us our funding,” I’ve heard a physicist declare, “we’d tell you what we think.” Little wonder I irked the porter who directed me toward central Cambridge.

The University of Cambridge consists of colleges as the US consists of states. Each college has a porter’s lodge, where visitors check in and students beg for help after locking their keys in their rooms. And where physicists ask for directions.

Last March, I ducked inside a porter’s lodge that bustled with deliveries. The woman behind the high wooden desk volunteered to help me, but I asked too many questions. By my fifth, her pointing at a map had devolved to jabbing.

*Read the subtext,* I told myself. *Leave.*

Or so I would have told myself, if not for that afternoon.

That afternoon, I’d visited Cambridge’s CMS, which merits every letter in “Centre for Mathematical Sciences.” Home to Isaac Newton’s intellectual offspring, the CMS consists of eight soaring, glass-walled, blue-topped pavilions. Their majesty walloped me as I turned off the road toward the gatehouse. So did the congratulatory letter from Queen Elizabeth II that decorated the route to the restroom.

I visited Nilanjana Datta, an affiliated lecturer in Cambridge’s Faculty of Mathematics, and her student, Felix Leditzky. Nilanjana and Felix specialize in entropies and one-shot information theory. Entropies quantify uncertainties and efficiencies. Imagine compressing many copies of a message into the smallest possible number of bits (units of memory). How few bits can you use per copy? That number, we call the *optimal compression rate*. It shrinks as the number of copies compressed grows. As the number of copies approaches infinity, the compression rate drops toward a number called the message’s *Shannon entropy*. If the message is quantum, the compression rate approaches the *von Neumann entropy*.
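As a rough sketch of the first claim: the Shannon entropy of a message’s letter distribution equals the asymptotically optimal number of bits per copy. (The distribution below is a made-up example, not one from the post.)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical four-letter alphabet with these letter frequencies:
probs = [0.5, 0.25, 0.125, 0.125]

# 1.75 bits per copy: the optimal compression rate
# in the limit of infinitely many copies.
print(shannon_entropy(probs))
```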

Good luck squeezing infinitely many copies of a message onto a hard drive. How efficiently can we compress fewer copies? According to *one-shot information theory*, the answer involves entropies other than Shannon’s and von Neumann’s. In addition to describing data compression, entropies describe the charging of batteries, the concentration of entanglement, the encrypting of messages, and other information-processing tasks.
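One entropy that appears in one-shot settings is the min-entropy, determined by a distribution’s single most likely outcome. A minimal sketch (the example distribution is again hypothetical):

```python
import math

def min_entropy(probs):
    """Min-entropy, in bits: set by the most probable outcome alone."""
    return -math.log2(max(probs))

def shannon_entropy(probs):
    """Shannon entropy, in bits, for comparison."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]
print(min_entropy(probs))      # 1.0
print(shannon_entropy(probs))  # 1.75
```

The min-entropy never exceeds the Shannon entropy, which is one reason one-shot tasks can demand more resources per copy than the asymptotic rate suggests.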

Speaking of compressing messages: Suppose one-shot information theory posted status updates on Facebook. Suppose the panel on the right-hand side of your Facebook page showed news weightier than celebrity marriages. The news feed might read, “TRENDING: One-shot information theory: Second-order asymptotics.”

Second-order asymptotics, I learned at the CMS, concerns how the optimal compression rate decays as the number of copies compressed grows. Imagine compressing a billion copies of a quantum message ρ. The number of bits needed about equals a billion times the von Neumann entropy *H*_{vN}(ρ). Since a billion is less than infinity, 1,000,000,000 *H*_{vN}(ρ) bits won’t suffice. Can we estimate the compression rate more precisely?

The question reminds me of gas stations’ hidden pennies. The last time I passed a station’s billboard, some number like $3.65 caught my eye. Each gallon cost about $3.65, just as each copy of ρ costs about *H*_{vN}(ρ) bits. But a 9/10, writ small, followed the $3.65. If I’d budgeted $3.65 per gallon, I couldn’t have filled my tank. If you budget *H*_{vN}(ρ) bits per copy of ρ, you can’t compress all your copies.

Suppose some station’s owner hatches a plan to promote business. If you buy one gallon, you pay $3.654. The more you purchase, the more the final digit drops from four. By cataloguing receipts, you calculate how a tank’s cost varies with the number of gallons, *n*. The cost equals $3.65 × *n* to a first approximation. To a second approximation, the cost might equal $3.65 × *n* + *a*√*n*, wherein *a* represents some number of cents. Compute *a*, and you’ll have computed the gas’s *second-order asymptotics*.
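The receipt-cataloguing step can be sketched in a few lines. The numbers below (a 16-gallon tank, a coefficient of 0.004 dollars) are invented for illustration:

```python
import math

BASE = 3.65  # dollars per gallon: the first-order price

def tank_cost(n, a):
    """Total cost: a linear term plus a sublinear, sqrt(n) correction."""
    return BASE * n + a * math.sqrt(n)

def estimate_a(n, cost):
    """Recover the second-order coefficient a from one receipt."""
    return (cost - BASE * n) / math.sqrt(n)

receipt = tank_cost(16, a=0.004)  # a hypothetical 16-gallon fill-up
print(estimate_a(16, receipt))    # recovers a, up to rounding
```

Data compression works analogously: replace $3.65 with the von Neumann entropy, and *a* becomes the second-order coefficient that Nilanjana and Felix compute.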

Nilanjana and Felix computed *a*’s associated with data compression and other quantum tasks. Second-order asymptotics met information theory when Strassen combined them in nonquantum problems. These problems developed under attention from Hayashi, Han, Polyanskiy, Poor, Verdú, and others. Tomamichel and Hayashi, as well as Li, introduced quantumness.

In the total-cost expression, $3.65 × *n* depends on *n* directly, or “linearly.” The second term depends on √*n*. As the number of gallons grows, so does √*n*, but √*n* grows more slowly than *n*. The second term is called “sublinear.”

Which is the word that rose to mind in the porter’s lodge. I told myself, *Read the sublinear text*.

Little wonder I irked the porter. At least—thanks to quantum information, my mistake, and facial expressions’ contagiousness—she smiled.

*With thanks to Nilanjana Datta and Felix Leditzky for explanations and references; to Nilanjana, Felix, and Cambridge’s Centre for Mathematical Sciences for their hospitality; and to porters everywhere for providing directions.*
