Raman Sundrum, a theorist from Johns Hopkins University, spoke next. He and Lisa Randall, from Harvard, are famous for inventing the Randall–Sundrum model, which attempts to resolve the hierarchy problem between the weak interaction energy scale and the Planck energy scale. They introduced the idea that the universe consists of two four-dimensional "branes" connected by a five-dimensional "bulk." These branes come from the brane extension of superstring theory, in which there are not only one-dimensional strings but also higher-dimensional branes. The standard model of particle physics inhabits one of these branes. The five-dimensional spacetime warps the four-dimensional spacetime in such a way that it reduces the tension between the electroweak energy scale at about 250 GeV and the Planck scale at 10^19 GeV. This model has been studied extensively by the particle physics and string theory communities, and it predicts a tower of Kaluza–Klein particles arising from the five-dimensional bulk.
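In rough outline (this is the standard textbook form of the Randall–Sundrum mechanism, not something quoted from Sundrum's talk), the warping works through an exponential factor in the five-dimensional metric that rescales all mass scales on our brane:

```latex
% RS1 warped metric: y is the fifth coordinate, k the bulk curvature scale,
% and r_c the size of the extra dimension
ds^2 = e^{-2k|y|}\,\eta_{\mu\nu}\,dx^{\mu}dx^{\nu} - dy^2 , \qquad 0 \le y \le \pi r_c .
% Mass scales on the "TeV brane" at y = \pi r_c are exponentially redshifted:
\Lambda_{\mathrm{EW}} \sim e^{-\pi k r_c}\, M_{\mathrm{Pl}} ,
\qquad k r_c \approx 11\text{--}12 \;\Rightarrow\; e^{-\pi k r_c} \sim 10^{-16} .
```

A modest value of k r_c thus turns the Planck scale of 10^19 GeV into an effective scale of roughly a TeV, which is the sense in which the warping "reduces the tension" between the two scales.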
In his talk, Sundrum began by saying that, now that a new boson has been discovered at 125 GeV, which is consistent with the standard-model Higgs boson, we are left with serious “naturalness” problems—namely, the Higgs mass hierarchy problem, the severe fine-tuning of the vacuum energy density, and the electroweak energy scale and Planck scale fine-tuning problems. He said that these problems were painful for theorists to endure, and put his hand on his stomach as a gesture of pain.
The naturalness problem has been part of the history of the standard model since the inception of the idea that we need a Higgs boson to produce the masses of elementary particles and to make the theory renormalizable. Sundrum reviewed the usual attempts to resolve the Higgs mass hierarchy problem, including the MSSM, the Little Higgs solution, and composite models of the Higgs boson such as Technicolor. Theorists abhor any severe fine-tuning in their fundamental theories, and the fine-tuning problems he mentioned are quite severe and make the standard model unattractive unless a BSM scenario is discovered at the LHC. So far, the LHC has not discovered any hints of BSM physics (see Chapter 10 for a detailed discussion of the naturalness problem).
The rest of the workshop consisted mainly of discussion groups that took place in two adjacent seminar rooms at the Perimeter Institute, the Space Room and the Gravity Room. One of the discussion groups concentrated on experimental strategies for the LHC’s search for new exotic particles such as supersymmetric particles, whereas the other group concerned itself with theoretical issues such as the naturalness problem and searching for BSM physics.
I attended several of the experimentalists' discussion sessions, and in one, which was organized by CERN experimentalist Albert De Roeck, I mentioned the importance of determining the spin and parity of the new boson, and discussed how this could be done by measuring the angular distributions of the decay of the new boson into two Zs and their subsequent decay into four leptons. I emphasized that the spins of all the observed particles in the standard model were either spin 1 or spin ½. Because the new boson had been observed to decay into two photons, it could not have spin 1: by the Landau–Yang theorem, a massive spin-1 particle cannot decay into two photons. The new boson, then, must have either spin 0 or spin 2. I repeated Landsberg's statement in his talk that a 3-sigma separation between the scalar and pseudoscalar parity assignments of the new boson could be reached once the LHC had accumulated about 30 inverse femtobarns of integrated luminosity at 8 TeV later in the year. I emphasized that no collider had ever detected an elementary boson with the quantum numbers of the vacuum.
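To make the parity part of this concrete, here is a minimal toy sketch (in Python) of how an angular analysis separates a scalar from a pseudoscalar hypothesis in the four-lepton channel. The flat-plus-cosine shapes and the ±0.25 coefficients are illustrative assumptions, not the real CMS or ATLAS matrix elements, and detector effects are ignored; the only point is that the expected separation grows with the number of recorded four-lepton events.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy probability densities in Phi, the angle between the two Z decay planes
# in H -> ZZ -> 4 leptons. The +/- 0.25*cos(2*Phi) modulation is an assumed,
# illustrative shape, not the full matrix-element prediction.
def pdf(phi, parity):
    coeff = 0.25 if parity == "scalar" else -0.25
    return (1.0 + coeff * np.cos(2.0 * phi)) / (2.0 * np.pi)

def sample(parity, n):
    # Simple accept-reject sampling of Phi on [0, 2*pi)
    out = np.empty(0)
    while out.size < n:
        phi = rng.uniform(0.0, 2.0 * np.pi, n)
        u = rng.uniform(0.0, 1.25 / (2.0 * np.pi), n)
        out = np.concatenate([out, phi[u < pdf(phi, parity)]])
    return out[:n]

def expected_separation(n_events, n_toys=200):
    # Median log-likelihood-ratio gap between the two hypotheses, in units of
    # the average spread of the two toy distributions (a rough "sigma").
    q = {"scalar": [], "pseudoscalar": []}
    for _ in range(n_toys):
        for truth in q:
            phi = sample(truth, n_events)
            q[truth].append(np.sum(np.log(pdf(phi, "scalar"))
                                   - np.log(pdf(phi, "pseudoscalar"))))
    gap = np.median(q["scalar"]) - np.median(q["pseudoscalar"])
    spread = 0.5 * (np.std(q["scalar"]) + np.std(q["pseudoscalar"]))
    return gap / spread

for n in (50, 200, 800):
    print(f"{n:4d} four-lepton events -> ~{expected_separation(n):.1f} sigma separation")
```

A real analysis would use the full set of decay angles and include backgrounds, but the same likelihood-ratio logic applies: more luminosity means more four-lepton events and a larger expected separation between the two parity hypotheses.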
De Roeck nodded in agreement after my little lecture, and said that experimentalists at the ATLAS and CMS detectors were using methods similar to what I had described to determine the spin and parity of the new boson. He agreed that this was an important experimental issue to resolve before they could conclude that the new boson was the standard-model Higgs boson.
At one of the theory sessions that I attended, a speaker had covered the blackboard with little t's, s's, b's, W's, Z's—all with tilde hats on them, signifying that they were "squarks," "winos," and "zinos," the supersymmetric partners of the observed standard-model particles. A popular way to solve the Higgs mass hierarchy problem has always been based on this supersymmetric model. However, the 2011 and 2012 data from the LHC have revealed no supersymmetric particles up to ever-higher energies.
At a lunch break I spoke to João Varela, who is a senior experimentalist at the LHC. He explained the amazing computer analysis of the data pouring out of the CMS and ATLAS detectors. The data are sent out to a large grid of computers around the globe. The analyzed data are then sent back to CERN and stored in "parking" facilities. So much data are produced by the trillions of proton–proton collisions that these "parking lots" have to be used to avoid an extreme traffic jam of data. The parked data are stored and sorted daily by mechanical CERN robots. Once the final analysis is complete, the results of the scattering experiments are immediately available to CERN experimentalists.
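As a deliberately toy illustration of the "parking" idea João described (the function, capacities, and event names below are my own invented placeholders, not CERN's actual software), the essential logic is to process what can be handled promptly and set the rest aside for later:

```python
# A toy sketch of "data parking": events beyond the prompt-processing capacity
# are set aside and analyzed later. All names and numbers are invented
# placeholders, not CERN's real systems.

def route_events(triggered_events, prompt_capacity, parking_lot):
    """Split one batch of triggered events into promptly processed and parked."""
    prompt = triggered_events[:prompt_capacity]
    parking_lot.extend(triggered_events[prompt_capacity:])
    return prompt

parking_lot = []
batch = [f"event_{i}" for i in range(10)]     # stand-ins for real collision records
processed_now = route_events(batch, prompt_capacity=6, parking_lot=parking_lot)
print(len(processed_now), "events processed promptly;", len(parking_lot), "parked for later")
```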
An important factor in guaranteeing that the results obtained truly represent what is happening in nature is the use of "blind analysis." The data analysis goes through a very complicated software process involving statistical algorithms. Not until all the final analyses are completed is the conclusive result "opened."
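One common way to implement such a blinding (a generic sketch, not the specific CMS or ATLAS procedure) is to add a hidden offset to the quantity being measured, so that cuts and algorithms can be tuned without anyone seeing the true value until the analysis is frozen:

```python
import numpy as np

rng = np.random.default_rng(42)

# A hidden offset, generated once and kept secret from the analysts.
# This "hidden offset" scheme is a generic illustration of blinding,
# not the particular procedure used at CERN.
_blind_offset = rng.uniform(-5.0, 5.0)   # in GeV; the range is an arbitrary choice

def blind(measured_mass_gev):
    """What analysts see while they develop cuts, fits, and algorithms."""
    return measured_mass_gev + _blind_offset

def unblind(blinded_value):
    """Applied only once, after the entire analysis chain is frozen."""
    return blinded_value - _blind_offset

fit_result = 125.8                 # a pretend fit result, in GeV
circulated = blind(fit_result)     # the value everyone works with during development
print("analysts see:      %.1f GeV" % circulated)
print("after unblinding:  %.1f GeV" % unblind(circulated))
```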
I asked João whether there was any danger of the huge data storage and the resulting analysis being hacked by computer hackers. I said, “They’ve hacked into the Pentagon and large financial institutions, so why shouldn’t they be able to hack into your system?”
He said yes, absolutely, attempts had been made to hack into the system, but so far they had not been successful. The way they avoid hacking is to isolate the data storage facilities at CERN from any outside influence.
The Perimeter Institute workshop indicated to me that the experimentalists connected with CERN were still proceeding cautiously, aiming to investigate the important attributes of the new boson before declaring it to be the standard-model Higgs. This is still in stark contrast to the media and physics-blog circus, where in many cases people have stated categorically that the Higgs boson has been discovered beyond doubt. For example, on his blog, "Not Even Wrong," Peter Woit presented his take on the discovery of the Higgs boson at the LHC. "Last month came an announcement from Geneva that physicists of my generation have been anxiously awaiting since our student days nearly forty years ago," he wrote, quoting from an article he had published in the Italian left-wing newspaper Il Manifesto. "The Higgs particle showed up more or less exactly in the manner predicted by the so-called Standard Model…. We saw the equations of our textbooks dramatically confirmed."
Yet perhaps some theorists are modulating their initial certainty about what the LHC has discovered. Soon after the July 4 announcement, Matt Strassler, like Peter Woit, displayed on his blog, "Of Particular Significance," a strong bias toward saying that the Higgs boson had been discovered. However, Strassler attended the workshop at the Perimeter Institute and wrote an article on his blog about it. He now talked about the "Higgs-like" boson, indicating that he had developed a more cautious approach to interpreting the new resonance.
SEPTEMBER 10, 2012
Stephen Hawking came to visit the Perimeter Institute for two weeks. This was after the inauguration of the new wing of the building, called the Stephen Hawking Centre. We were informed by our director, Neil Turok, that it was possible to meet Hawking and discuss research topics with him. He appeared at lunch with his nurse, seated as always in his supercomputer wheelchair. At lunch, I was joined by other Perimeter Institute colleagues and Jim Hartle, who had been invited from California to be present during Hawking’s visit to conduct some collaborative research with him.
I recalled that during a sabbatical leave at Cambridge in 1972, I arrived a couple of times at the Department of Applied Mathematics and Theoretical Physics in Silver Street and found Stephen sitting in his wheelchair, waiting for someone to turn up to help him up into the building, which I did. At
that time, 40 years ago, when he was about 30, Stephen was in much better shape physically and was able to talk to some degree. He was often seen whizzing around Cambridge in his motorized chair, a danger to himself and others as he reconnoitered the Cambridge traffic. This was before he acquired his remarkable electronic wheelchair, which he uses today.
Stephen was now almost completely paralyzed. He had a metal contraption attached to his right cheek. By twitching a muscle in his cheek, he was able to communicate with his computer and select letters on his screen, thereby forming sentences, which his electronic voice synthesizer read out.
I sought out Stephen and found him in an office set aside for him and his nurse. Jim Hartle was present in the office, together with another collaborator, both of them armed with clipboards and paper. I wondered how such a collaboration could proceed by only communicating with Stephen through his very curtailed means of dialogue. He sat facing the door as I walked in. I said, “Stephen, I’m John Moffat. You remember we have met in the past.” I thought I noted a glimmer of recognition in his eyes. I apologized to Jim Hartle for the interruption, explaining that I was leaving Waterloo the next day, and this was the only opportunity I had to see Stephen.
I approached Stephen and said that I wanted to discuss some research with him. I told him I was currently studying the properties of the new boson discovered at the LHC.
“Stephen, in my opinion it’s too early to say that the new boson is the standard-model Higgs boson,” I said. “For example, we do not yet know the spin or parity of the boson, and these are important properties that must be understood experimentally.”
Now I waited while Stephen formed a response to my statement. I watched as he selected the letters on his screen to form his sentence. This took several minutes. Eventually, the sentence was complete on his computer screen, and as I read it, his electronic voice said, “I hope it is not the Higgs boson.”
It was well known that Stephen had bet Gordon Kane, a professor at the University of Michigan at Ann Arbor, $100 that the LHC would not discover the standard-model Higgs boson. His opinion was based on work he had done on black holes. I said, “Stephen, it is too early for you to pay your $100 bet.”
The nurse seated nearby leaned forward and said, “Stephen! You have already paid that $100!”
I waited while Stephen began to form another response. I leaned over his shoulder and watched the sentence forming on his screen. Eventually I read, as the electronic voice spoke, “If it is not the Higgs, I will claim back my money.”
Then I said, “We have to wait for more data to determine important properties of the new boson. Hopefully, these data will be available by the end of the year or early next year.”
Then I waited again patiently while Stephen provided a longer response to my latest statement. On the screen the words came slowly: “We need more data and we must wait until they are available before we make a decision about the Higgs.”
I said, “Thank you, Stephen. Take care.” I left, nodding my thanks to the nurse. In the meantime, Jim Hartle and his collaborator had left the room and were working nearby.
OCTOBER 31, 2012
Blog sites in particle physics can keep one up to date on rumors about new LHC data and the general feeling in the physics community about what has been discovered. In particular, after CERN’s July 4 announcement of the discovery of the new boson, there was euphoria. The blogs went all the way from demanding that Peter Higgs and a choice of two others from the five living founders of the standard Higgs boson model be awarded a Nobel Prize immediately to cautionary statements that we did not know enough about the properties of the new boson to be sure that it was the Higgs boson. In the United Kingdom, blogs proclaimed that Peter Higgs should be knighted. On YouTube, Peter Higgs was shown choking up and becoming teary-eyed as he sat in the audience listening to the announcement of the new boson on July 4.
Since that announcement, a number of new papers have appeared on the electronic archive explaining how the critical measurements of the parity of the new boson could be performed at the LHC. To get a three-standard-deviation (3-sigma) separation in the data between the scalar (positive parity) and pseudoscalar (negative parity) assignments for the new boson would require significantly more luminosity and new data, which hopefully could be obtained before the LHC shut down in February 2013. Many papers now referred to the new boson as the "X particle," indicating that the authors were aware of the fact that we were not yet in a position to claim the discovery of the standard-model Higgs boson. The consensus among theorists seemed to be that the new boson was indeed the standard-model Higgs boson, whereas the consensus among experimentalists was that it was a wait-and-see game, while we anticipated a more definitive statement about the identity of the new boson with the advent of new data.
It was becoming apparent to me that if the new LHC data confirmed that the new boson was a pseudoscalar particle, not a scalar elementary particle as required by the standard model, then this would have serious consequences for the standard model. Some particle theorists at the Perimeter Institute and elsewhere claim that we already know that the new boson is a scalar particle because the standard model requires it to be so! I have told them that I defer to experimental physics and nature to make decisions about the future of particle physics, and try not to be persuaded by theoretical prejudices. I have made it clear to the particle theorists that if the data determine that it is a pseudoscalar boson, then the coupling of the pseudoscalar Higgs boson to the W and Z bosons would be significantly different from the coupling of the elementary scalar boson to those bosons. A consequence of this is that the standard model would no longer be a renormalizable theory.
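To see why, compare the generic forms of the two couplings (these are standard effective-Lagrangian expressions, not taken from my exchanges with the theorists): the scalar Higgs h couples to the Z at tree level through a dimension-four term fixed by electroweak symmetry breaking, whereas a pseudoscalar a can couple to two Z bosons only through a dimension-five operator suppressed by some new mass scale Λ:

```latex
% Scalar (standard-model) coupling: dimension four, renormalizable
\mathcal{L}_{hZZ} \;=\; \frac{m_Z^2}{v}\, h\, Z_\mu Z^\mu ,
% Pseudoscalar coupling: dimension five, suppressed by a new mass scale \Lambda
\mathcal{L}_{aZZ} \;=\; \frac{c}{\Lambda}\, a\, \epsilon^{\mu\nu\rho\sigma} Z_{\mu\nu} Z_{\rho\sigma} .
```

The 1/Λ factor in the second operator is what spoils renormalizability and, for a large Λ, what suppresses the pseudoscalar's rate into two Zs and, eventually, four leptons.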
It should be understood that the current analysis of the parity of the new boson is model dependent. The experimentalists compare the standard scalar Higgs boson model to an “effective” phenomenological pseudoscalar Higgs boson model. The latter model cannot be consistent with the current data unless it is manipulated artificially. A true analysis for parity should compare the standard Higgs boson model with an alternative non-Higgs boson model, such as my composite quarkonium resonance.
A problem with the pseudoscalar Higgs boson model is that its coupling to the two Z bosons would lead to a decay rate into the final four leptons that is more than an order of magnitude smaller than the rate yielded by the scalar Higgs boson. This would destroy the possibility of fitting the pseudoscalar Higgs decay into four leptons, which is one of the crucial golden decay channel results. This decay channel was critical in the CMS and ATLAS experiments, leading to the announcement of the discovery of the new boson. Future analyses of the data to determine the spin and parity of the new boson should eventually be able to avoid such model dependence.
The particle physics community is now waiting anxiously for new data to be "unblinded." After the data are returned to CERN by the worldwide grid of computers, they are blinded, or locked up, until about two weeks before a new announcement is made. The fact that the computer analysts around the world do not know the overall results is analogous to a blinded study in medical research.
In actuality, the data are unblinded to a few analysts a week or two before the official unblinding of the data to set the parameters of the algorithms used to obtain the final results. Therefore, strictly speaking, there is some bias built into the unblinding of the data, which leads to an important issue regarding the ultimate veracity of the result of the data analysis. There can be two outcomes of the blinding and unblinding of the data: either the experimental groups accept the result as it is conveyed to them through the statistical algorithms, or they do not accept the result and manipulate it to remove perceived errors. The latter action can lead to human bias in interpreting the results. For example, certain perceived "outliers" in the data can be removed, according to information shared by experimentalists online, which changes the final result significantly. The many CMS and ATLAS analysts have meetings in private to discuss the results of the data analysis before any official announcements are made. These meetings are not open to the public or to other physicists, including those at CERN, who are not part of the inner circle of analysts. We must have confidence that the analysts make every attempt to prevent bias in analyzing the data and in deciding what to announce.
JANUARY 2013
The next presentation of new data from CERN after the July 4 seminar and press conference took place at a high-energy physics conference at the University of Kyoto on November 14, 2012. The new results reported at the Kyoto meeting were not significantly different from the previous data. I followed the talks, posted on the conference website, with great interest. A significant fact that emerged from the Kyoto talks was that the CMS group had not updated its data for the decay of the new boson into two photons. This was the cause of several rumors appearing on blogs regarding the status of the CMS two-photon decay results. However, the latest CMS combination of data from the two golden channels determined the mass of the new boson to be 125.8 GeV, a measurement claimed to be accurate to within half a percent. In contrast, the ATLAS determination of the mass of the new boson using these two golden channels was 126 GeV, also with an error of about half a percent.
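As a quick consistency check (my own arithmetic, using the quoted half-percent precision):

```latex
\delta m \approx 0.005 \times 126~\mathrm{GeV} \approx 0.6~\mathrm{GeV} ,
\qquad |126.0 - 125.8|~\mathrm{GeV} = 0.2~\mathrm{GeV} < \delta m ,
```

so the CMS and ATLAS mass determinations agree comfortably within their quoted uncertainties.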
Moreover, it was rumored on the physics blogs, and in supposed leaks from the CMS collaboration, that the signal strength for the two-photon decay channel result had decreased from its original 4-sigma value announced at the July 4 CERN seminar. However, these were just rumors, and there was no confirmation of their validity from CERN in Kyoto.