Books of 2024: Final Part

Gazettabyte has been asking industry figures to pick their reads of 2024. In the final part, Professor Polina Bayvel, Hojjat Salemi, Professor Laura Lechuga, and the editor of Gazettabyte share their selections.

Professor Polina Bayvel, Royal Society Research Professor & Head of the Optical Networks Group, Department of Electronic & Electrical Engineering, UCL

I recently attended a Royal Society Discussion Meeting where Leslie Valiant gave a brilliant talk on educability as a better definition than intelligence. A Harvard professor, he has developed many algorithms that underpin today’s networks, including Valiant load balancing. He is a profound thinker, and I immediately wanted to read his book, The Importance of Being Educable: A New Theory of Human Uniqueness.

Although written in a popular style, it argues that educability (a precisely defined computational model) is a better term than intelligence, for which no agreed definition exists. He explains how we, as a human race, have been able to create the technological civilisation that we have and argues that the enabler of this civilisation is educability. He also implies that current AI models are not educable. The book is masterful in its lucid explanation of complex concepts in computation. I really could not put it down.

Another read that took my breath away is A. N. Tolstoy’s The Road to Calvary (Russian: Хождение по мукам, romanised: Khozhdeniye po mukam, lit. ’Walking Through Torments’), also translated as Ordeal. The trilogy is set just before the Russian Revolution (starting in 1914) and follows the lives of two sisters and their lovers and husbands through the revolution and the Russian Civil War. It was a staple in Soviet schools, but having left at age 12, I missed it and have only recently read it.

It’s a monument to history, and when one reads it, one realises that the well-to-do Russian liberals who argued for change and the removal of Czarist rule had no idea what fate would face them or how their lives would change forever.

It made me think of today’s parallel – do we always understand the consequences of wanting liberal changes? The Russian pre-Revolution liberals, the intelligentsia, wanted democracy and more power for the people. What they got was the opposite – totalitarian oppression.

I was also struck by the stark realisation that had WWI not occurred, there would not have been a revolution, and the lives of so many people, including that of my own family, would have followed a completely different course.

Hojjat Salemi, Chief Business Development Officer, Ranovus

Several years ago, I decided to avoid social media platforms like Instagram and TikTok, as well as the news channels Fox News and CNN. I found them to be major distractions and wasteful of time.

I used the time instead to read and listen to author interviews (podcasts) on YouTube, which often provide deeper insights into why they wrote their books and their key ideas. One of the best decisions I’ve made is controlling what I watch on YouTube—without ads! If you’re looking for good books about technology, here are my recommendations:

The book that won the Financial Times Business Book of the Year for 2024 is Supremacy: AI, ChatGPT, and the Race that Will Change the World by Parmy Olson.

It offers a fascinating narrative starting in 2012, focusing on how AI systems have developed, with a spotlight on two main figures: Demis Hassabis, co-founder of DeepMind, and Sam Altman, co-founder of OpenAI.

The book explores three major themes:

  • how AI could reshape society as it grows increasingly intelligent,
  • the unintended consequences of the technologies we create,
  • and the moral dilemmas and risks of pushing these innovations too far.

It’s a fast-paced, thought-provoking look at the future.

Another suggestion is Read Write Own: Building the Next Era of the Internet by Chris Dixon. The book is written clearly and engagingly and explains complex ideas like blockchain, NFTs, and decentralised networks. Dixon describes the evolution of the internet: the early days of reading information, the read-write era of social media where people shared but didn’t own content, and the emerging read-write-own era (Web3), where blockchain allows users to own digital assets.

While I’ve been thinking about decentralised networks a lot, I’m still not convinced they can take off, given our geopolitical challenges. Take Bitcoin, for example; if something goes wrong, who do you call? Moreover, Web3’s dominant players still rely on centralised computing power. It’s a thoughtful read, but only time will tell how Web3 unfolds.

Lastly, I recommend Ethics of Socially Disruptive Technologies: An Introduction. The book, available as a free PDF, is highly educational on how new technologies disrupt societal norms and ethical frameworks.

The book examines four specific technologies: social media, robots, climate engineering, and artificial wombs. For instance, social media was supposed to give everyone a voice and bring people together. Instead, it has often divided us, spread misinformation, and allowed foreign powers to interfere in elections. It challenges the idea of “government of the people, by the people, for the people” today. This book is perfect for anyone wanting to understand new technologies’ unintended consequences.

Professor Laura Lechuga, Head of the Nanobiosensors and Bioanalytical Application Group at the Catalan Institute of Nanoscience and Nanotechnology (ICN2)

I love reading and do it frequently, especially during the many work trips I take throughout the year.

My favourite reading of 2024 was Chip War: The Fight for the World’s Most Critical Technology by Chris Miller. It is an impressive book about the development of microelectronics and the pivotal role of chips in shaping world power.

Having a PhD focused on microelectronics, I enjoyed reading a book that I believe will become a classic. What I appreciated most were the personal stories of the brilliant scientists and engineers who conceived and developed the technology, overcame the technical obstacles that transformed the semiconductor industry, and helped found some of the most influential companies in the world. This is a must-read book.

My second favourite book was The Maniac by Benjamin Labatut. The book is a combination of history and novel in which Labatut tells the story of brilliant physicists such as John von Neumann, a genius able to invent new fields. But the same prodigy whose work impacted future advances in computing terrified the people around him, and his personal life was miserable. The book describes the evolution of von Neumann’s work through to the battle between AI and a world champion player of the game Go. It is a book that reflects on the limits of technology, an original, addictive, and beautiful read.

Another book I loved in 2024 was Lessons in Chemistry by Bonnie Garmus. It is a feminist novel about how difficult a professional career was for women scientists in the 1960s. I saw myself totally reflected in it, as our position has not changed much. It is a book that mixes funny and sad situations, is easy to read, very enjoyable, and has a clear message.

My last recommendation is Ayn Rand’s old classic, Atlas Shrugged. It isn’t easy to read due to its length, but it is a fascinating futuristic story about a dystopian United States, and it is now more relevant than ever. It is a story of how human stupidity gains a significant advantage over intelligence, and of the devastating consequences for the U.S. This could also extend to the rest of the world, perhaps a prophecy to be fulfilled in the coming years.

Roy Rubenstein, Editor of Gazettabyte

I read many books in 2024 and will highlight three. One is Strength in What Remains by Tracy Kidder. I had read his most recent book, Rough Sleepers: Dr. Jim O’Connell’s Urgent Mission to Bring Healing to Homeless People, and this was my follow-up read. Kidder is a master storyteller who finds the most remarkable individuals to write about. I highly recommend both.

Dame Hilary Mantel is best known for her Wolf Hall trilogy. Last year, a book of her writings—articles for literary magazines, essays, film reviews, and her BBC Reith Lectures—was published. A Memoir of My Former Self: A Life in Writing is an excellent read by a fabulous writer.

Lastly, I recommend the 55-hour Audible version of Alexandre Dumas’s The Count of Monte Cristo. While listening, I walked past the local cinema and realised there was a 2024 film version being shown. I entered, showed the attendant the Audible version, and asked if the film was shorter.


Building the data rate out of smaller baud rates

Professor Andrew Lord

In the second article addressing the challenges of increasing the symbol rate of coherent optical transport systems, Professor Andrew Lord, BT’s head of optical network research, argues that the time is fast approaching to consider alternative approaches.

Coherent discourse 2

Coherent optical transport systems have advanced considerably in the last decade to cope with the relentless growth of internet traffic.

One-hundred-gigabit wavelengths, long the networking standard, have been replaced by 400-gigabit ones while state-of-the-art networks now use 800 gigabits.

Increasing the data carried by a single wavelength requires advancing the coherent digital signal processor (DSP), electronics and optics.

It also requires faster symbol rates.

Moving from 32 to 64 to 96 gigabaud (GBd) has increased the capacity of coherent transceivers from 100 to 800 gigabits.
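The arithmetic behind these jumps can be sketched simply: the raw bit rate of a coherent carrier is the symbol rate times the bits carried per symbol times the two polarizations. This is an illustration only, not any vendor's exact framing — the modulation formats and the FEC and framing overheads below are assumptions, and commercial net rates (100G, 400G, 800G) sit below the raw figures shown.

```python
# Rough line-rate arithmetic for a dual-polarization coherent transceiver.
# Raw (pre-FEC) rate = symbol rate x bits per symbol x polarizations.

def raw_rate_gbps(baud_gbd: float, bits_per_symbol: int, polarizations: int = 2) -> float:
    """Raw (pre-FEC) line rate in Gb/s."""
    return baud_gbd * bits_per_symbol * polarizations

# 32 GBd dual-polarization QPSK (2 bits/symbol): the 100G era
rate_100g_era = raw_rate_gbps(32, 2)    # 128 Gb/s raw; ~100G net after overhead

# 64 GBd dual-polarization 16-QAM (4 bits/symbol): the 400G era
rate_400g_era = raw_rate_gbps(64, 4)    # 512 Gb/s raw

# 128 GBd with a higher-order constellation reaches the terabit class
rate_terabit = raw_rate_gbps(128, 6)    # 1536 Gb/s raw
```

The pattern is why each generation needs both a faster symbol rate and a denser constellation: neither lever alone takes 100G to 800G.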

Last year, Acacia, now part of Cisco, announced the first 1-terabit-plus wavelength coherent modem that uses a 128GBd symbol rate.

Other vendors will also be detailing their terabit coherent designs, perhaps as soon as the OFC show, to be held in San Diego in March.

The industry consensus is that 240GBd systems will be possible towards the end of this decade although all admit that achieving this target is a huge challenge.

Baud rate

Upping the baud rate delivers several benefits.

A higher baud rate increases the capacity of a single coherent transceiver while lowering the cost and power used to transport data. Simply put, operators get more bits for the buck by upgrading their coherent modems.

But some voices in the industry question the relentless pursuit of higher baud rates. One is Professor Andrew Lord, head of optical network research at BT.

“Higher baud rate isn’t necessarily a panacea,” says Lord. “There is probably a stopping point where there are other ways to crack this problem.”

Parallelism

Lord, who took part in a workshop at ECOC 2021 addressing whether 200+ GBd transmission systems are feasible, says he used his talk to get people to think about this continual thirst for higher and higher baud rates.

“I was asking the community, ‘Are you pushing this high baud rate because it is a competition to see who builds the biggest rate?’ because there are other ways of doing this,” says Lord.

One such approach is to adopt a parallel design, integrating two channels into a transceiver instead of pushing a single channel’s symbol rate.

“What is wrong with putting two lasers next to each other in my pluggable?” says Lord. “Why do I have to have one? Is that much cheaper?”

For an operator, what matters is the capacity rather than how that capacity is achieved.

Lord also argues that having a pluggable with two lasers gives an operator flexibility.

A single-laser transceiver can only go in one direction, but with two lasers, networking becomes possible. “The baud rate stops that, it’s just one laser so I can’t do any of that anymore,” says Lord.

The point is being reached, he says, where having two lasers, each at 100GBd, probably runs better than a single laser at 200GBd.

Excess capacity

Lord cites other issues arising from the use of ever-faster symbol rates.

What about links that don’t require the kind of capacity offered by very high baud rate transceivers?

If the link spans a short distance, it may be possible to use a higher modulation scheme such as 32-ary quadrature amplitude modulation (32-QAM) or even 64-QAM. With a 200GBd symbol rate transceiver, that equates to a 3.2-terabit transceiver. “Yet what if I only need 100 gigabits?” says Lord.

One option is to turn down the data rate using, say, probabilistic constellation shaping. But then the high-symbol rate would still require a 200GHz channel. Baud rate equals spectrum, says Lord, and that would be wasting the fibre’s valuable spectrum.
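Lord’s point that baud rate equals spectrum can be put as a back-of-the-envelope calculation. Assuming a Nyquist-shaped channel with a given spectral roll-off (the 0.1 roll-off factor below is an illustrative assumption, not a quoted figure), the occupied optical bandwidth scales with the symbol rate regardless of how much data the constellation actually carries:

```python
# Occupied optical bandwidth of a Nyquist-shaped coherent channel:
# approximately symbol rate x (1 + roll-off factor).

def occupied_bandwidth_ghz(baud_gbd: float, rolloff: float = 0.1) -> float:
    """Approximate occupied optical bandwidth in GHz."""
    return baud_gbd * (1.0 + rolloff)

# A 200 GBd carrier occupies ~220 GHz of fibre spectrum whether it runs
# at its full rate or is shaped down to carry only 100 gigabits.
bw_single = occupied_bandwidth_ghz(200)

# Two 100 GBd carriers occupy the same spectrum in aggregate, but each
# can be lit, placed, or switched off independently.
bw_pair = 2 * occupied_bandwidth_ghz(100)
```

This is why turning the data rate down with constellation shaping does not give the spectrum back: the channel width is set by the baud rate, not the payload.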

Another solution would be to insert a different transceiver but that causes sparing issues for the operators.

Alternatively, the baud rate could be turned down. “But would operators do that?” says Lord. “If I buy a device capable of 200GBd, wouldn’t I always operate it at its maximum or would I turn it down because I want to save spectrum in some places?”

Turning the baud rate down also requires the freed spectrum to be used and that is an optical network management challenge.

“If I have to think about defragmenting the network, I don’t think operators will be very keen to do that,” says Lord.

Pushing electronics

Lord raises another challenge: the coherent DSP’s analogue-to-digital and digital-to-analogue converters.

Operating at a 200+ GBd symbol rate means the analogue-to-digital converters at the coherent receiver must run at 200 gigasamples per second or faster.

“You have to start sampling incredibly fast and that sampling doesn’t work very well,” says Lord. “It’s just hard to make the electronics work together and there will be penalties.”
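The converter requirement follows from the sampling theorem: the ADC must capture at least one sample per symbol, and practical receiver DSPs oversample somewhat. A minimal sketch, assuming an illustrative oversampling factor (the 1.25 samples-per-symbol figure below is a common design point, not a number quoted in the article):

```python
# Required ADC sampling rate for a coherent receiver:
# symbol rate x samples per symbol (oversampling factor).

def adc_rate_gsps(baud_gbd: float, samples_per_symbol: float = 1.25) -> float:
    """Required ADC sampling rate in gigasamples per second."""
    return baud_gbd * samples_per_symbol

# At 200 GBd, even one sample per symbol already demands 200 GSa/s...
minimum_rate = adc_rate_gsps(200, 1.0)   # 200.0 GSa/s

# ...and a modest oversampling factor pushes the requirement higher still.
typical_rate = adc_rate_gsps(200)        # 250.0 GSa/s
```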

Lord cites research work at UCL that suggests that the limitations of the electronics – and the converters in particular – are not negligible. Just connecting two transponders over a short piece of fibre shows a penalty.

“There shouldn’t be any penalty but there will be, and the higher the baud rate, you will get a penalty back-to-back because the electronics are not perfect,” he says.

He suspects the penalty is of the order of 1 or 2dB. That is a penalty lost to the system margin of the link before the optical transmission even starts.

Such a loss is clearly unacceptable, especially when considering how hard engineers work to enhance algorithms for even a few tenths of a decibel of gain.

Lord expects that such compromised back-to-back performance will ultimately lead to the use of multiple adjacent carriers.

“Advertising the highest baud rate is obviously good for publicity and shows industry leadership,” he concludes. “But it does feel that we are approaching a limit for this, and then the way forward will be to build aggregate data rates out of smaller baud rates.”


ECOC 2009: Squeezing optics out of optical communications

An interview with Polina Bayvel, Professor of Optical Communications and Networks and head of the Optical Networks Group at University College London (UCL), on her ECOC conference impressions.


What did you find noteworthy at ECOC 2009?

PB: So much work on digital signal processing and coherent detection... will these techniques lead to another revolution in fibre optics? But there is much to understand about how to design the DSP algorithms and how best to match them to appropriate fibre maps in some implementable way.

Did anything at the conference surprise you?

PB: Is there really a capacity crunch or is it a cost crunch and who will end up paying?  There is much work on new fibres, new DSP but why is no-one looking at new amplifiers?

What did you learn from ECOC?

PB: I learnt how little progress has been made in all-optical networking - the well-trodden ideas and arguments on wavelength routing, which have been circulating for over 15 years, are not being taken up by operators but are being re-discovered and re-offered as new... and just how conservative the operators still are, except those in Japan.

Did you see or hear anything that gives reason for industry optimism?

PB: Lots of buzz about linear and nonlinear DSP, error correcting codes, net coding gain, FPGAs and many other developments which, whilst invigorating the industry are squeezing optics out of optical communications.  Here is to the fightback for optics!

