Evidence suggests subatomic particles could defy the standard model (article)
This topic has 4 replies, 3 voices, and was last updated 6 years, 1 month ago by c_howdy.
April 25, 2018 at 1:03 pm · #51976 · c_howdy (Participant)
August 27, 2015, University of Maryland
https://phys.org/news/2015-08-evidence-subatomic-particles-defy-standard.html
The Standard Model of particle physics, which explains most of the known behaviors and interactions of fundamental subatomic particles, has held up remarkably well over several decades. This far-reaching theory does have a few shortcomings, however—most notably that it doesn’t account for gravity. In hopes of revealing new, non-standard particles and forces, physicists have been on the hunt for conditions and behaviors that directly violate the Standard Model.
Now, a team of physicists working at CERN’s Large Hadron Collider (LHC) has found new hints of particles—leptons, to be more precise—being treated in strange ways not predicted by the Standard Model. The discovery, scheduled for publication in the September 4, 2015 issue of the journal Physical Review Letters, could prove to be a significant lead in the search for non-standard phenomena.
The team, which includes physicists from the University of Maryland who made key contributions to the study, analyzed data collected by the LHCb detector during the first run of the LHC in 2011-12. The researchers looked at B meson decays, processes that produce lighter particles, including two types of leptons: the tau lepton and the muon. Unlike their stable lepton cousin, the electron, tau leptons and muons are highly unstable and quickly decay within a fraction of a second.
According to a Standard Model concept called “lepton universality,” which assumes that leptons are treated equally by all fundamental forces, decays to the tau lepton and to the muon should happen at the same rate, once corrected for their mass difference. However, the team found a small but notable deviation from the predicted decay rates, suggesting that as-yet undiscovered forces or particles could be interfering in the process.
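To spell out the observable being tested: the measured quantity is the ratio of branching fractions for the two decay channels named in the paper's title at the end of this article, conventionally labelled R(D*). Under lepton universality, the Standard Model fixes this ratio once the tau–muon mass difference is taken into account, so a measured value away from that prediction is the signature being hunted. In the usual notation:

```latex
R(D^{*}) \;=\; \frac{\mathcal{B}(\bar{B}^{0} \to D^{*+}\,\tau^{-}\,\bar{\nu}_{\tau})}
                     {\mathcal{B}(\bar{B}^{0} \to D^{*+}\,\mu^{-}\,\bar{\nu}_{\mu})}
```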
“The Standard Model says the world interacts with all leptons in the same way. There is a democracy there. But there is no guarantee that this will hold true if we discover new particles or new forces,” said study co-author and UMD team lead Hassan Jawahery, Distinguished University Professor of Physics and Gus T. Zorn Professor at UMD. “Lepton universality is truly enshrined in the Standard Model. If this universality is broken, we can say that we’ve found evidence for non-standard physics.”
The LHCb result adds to a previous lepton decay finding, from the BaBar experiment at the Stanford Linear Accelerator Center, which suggested a similar deviation from Standard Model predictions. (The UMD team has participated in the BaBar experiment since its inception in the 1990s.) While both experiments involved the decay of B mesons, electron collisions drove the BaBar experiment and higher-energy proton collisions drove the LHC experiment.
“The experiments were done in totally different environments, but they reflect the same physical model. This replication provides an important independent check on the observations,” explained study co-author Brian Hamilton, a physics research associate at UMD. “The added weight of two experiments is the key here. This suggests that it’s not just an instrumental effect—it’s pointing to real physics.”
“While these two results taken together are very promising, the observed phenomena won’t be considered a true violation of the Standard Model without further experiments to verify our observations,” said co-author Gregory Ciezarek, a physicist at the Dutch National Institute for Subatomic Physics (NIKHEF).
“We are planning a range of other measurements. The LHCb experiment is taking more data during the second run right now. We are working on upgrades to the LHCb detector within the next few years,” Jawahery said. “If this phenomenon is corroborated, we will have decades of work ahead. It could point theoretical physicists toward new ways to look at standard and non-standard physics.”
With the discovery of the Higgs boson—the last major missing piece of the Standard Model—during the first LHC run, physicists are now looking for phenomena that do not conform to Standard Model predictions. Jawahery and his colleagues are excited for the future, as the field moves into unknown territory.
“Any knowledge from here on helps us learn more about how the universe evolved to this point. For example, we know that dark matter and dark energy exist, but we don’t yet know what they are or how to explain them. Our result could be a part of that puzzle,” Jawahery said. “If we can demonstrate that there are missing particles and interactions beyond the Standard Model, it could help complete the picture.”
More information: The research paper, “Measurement of the ratio of branching fractions $\mathcal{B}(\bar{B}^{0} \rightarrow D^{*+}\tau^{-}\bar{\nu}_{\tau})/\mathcal{B}(\bar{B}^{0} \rightarrow D^{*+}\mu^{-}\bar{\nu}_{\mu})$,” The LHCb Collaboration, is scheduled to appear online August 31, 2015 and to be published September 4, 2015 in the journal Physical Review Letters.
Journal reference: Physical Review Letters
Provided by: University of Maryland
April 25, 2018 at 2:49 pm · #51979 · Michael Winn (Keymaster)
Thanks for these science posts, I enjoy monitoring cosmology.
But we have to keep all this in perspective: the Standard Model and its particles describe only about 4% of the known cosmos. No particles have been found in Dark Energy/Matter, which I believe is parallel to Taoist Pre-Natal Qi and Jing.
Alchemy is the path to experience the “light within the formless Qi field”.
April 26, 2018 at 6:48 am · #51980 · rideforever (Participant)
One day “science” will have a model that covers 100% of the cosmos. On TV they will announce it and there will be special reports on it. You will watch it all with excitement.
And then you turn the TV off and return to your life and your reality.
Perhaps then you will realise some things.
August 6, 2018 at 3:41 pm · #52800 · c_howdy (Participant)
The universe’s rate of expansion is in dispute – and we may need new physics to solve it
August 6, 2018 by Thomas Kitching, The Conversation
https://phys.org/news/2018-08-universe-expansion-dispute-physics.html
Next time you eat a blueberry (or chocolate chip) muffin consider what happened to the blueberries in the batter as it was baked. The blueberries started off all squished together, but as the muffin expanded they started to move away from each other. If you could sit on one blueberry you would see all the others moving away from you, but the same would be true for any blueberry you chose. In this sense galaxies are a lot like blueberries.
Since the Big Bang, the universe has been expanding. The strange fact is that there is no single place from which the universe is expanding, but rather all galaxies are (on average) moving away from all the others. From our perspective in the Milky Way galaxy, it seems as though most galaxies are moving away from us – as if we are the centre of our muffin-like universe. But it would look exactly the same from any other galaxy – everything is moving away from everything else.
To make matters even more confusing, new observations suggest that the rate of this expansion in the universe may be different depending on how far away you look back in time. This new data, published in the Astrophysical Journal, indicates that it may be time to revise our understanding of the cosmos.
Cosmologists characterise the universe’s expansion in a simple law known as Hubble’s Law (named after Edwin Hubble – although in fact many other people preempted Hubble’s discovery). Hubble’s Law is the observation that more distant galaxies are moving away at a faster rate. This means that galaxies that are close by are moving away relatively slowly by comparison.
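In symbols, Hubble’s Law says a galaxy’s recession speed v grows in proportion to its distance d, with the Hubble Constant H0 as the constant of proportionality:

```latex
v = H_{0}\, d
```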
The relationship between the speed and the distance of a galaxy is set by “Hubble’s Constant”, which is about 44 miles (70 km) per second per megaparsec (a unit of length in astronomy). What this means is that a galaxy gains about 50,000 miles per hour for every million light years it is away from us. In the time it takes you to read this sentence, a galaxy at one million light years’ distance moves away by about an extra 100 miles.
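As a rough check on those figures, here is a minimal Python sketch of the unit conversion (the constants are rounded, and the roughly seven-second reading time is my own assumption, not a number from the article):

```python
# Rough check of the unit conversions quoted in the article (a sketch with
# rounded constants; the ~7-second reading time is an assumption).
KM_PER_MILE = 1.609344
MLY_PER_MPC = 3.2616          # million light years per megaparsec

def kms_per_mpc_to_mph_per_mly(h0_kms_mpc):
    """Convert a Hubble constant from km/s/Mpc to mph per million light years."""
    kms_per_mly = h0_kms_mpc / MLY_PER_MPC        # km/s gained per million ly
    return kms_per_mly * 3600.0 / KM_PER_MILE     # -> miles per hour

h0 = 70.0                                          # km/s/Mpc, as in the article
mph_per_mly = kms_per_mpc_to_mph_per_mly(h0)
print(f"{h0:.0f} km/s/Mpc is about {mph_per_mly:,.0f} mph per million light years")

# Hubble's Law: recession speed v = H0 * d.  For a galaxy 1 million ly away,
# how far does it recede while you read a sentence (assume ~7 seconds)?
reading_time_hours = 7.0 / 3600.0
extra_miles = mph_per_mly * 1.0 * reading_time_hours
print(f"Extra distance in ~7 seconds: about {extra_miles:.0f} miles")
```

Run as written, this gives roughly 48,000 mph per million light years and a bit over 90 extra miles, consistent with the article’s rounded figures of 50,000 mph and 100 miles.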
This expansion of the universe, with nearby galaxies moving away more slowly than distant galaxies, is what one expects for a uniformly expanding cosmos with dark energy (an invisible force that causes the universe’s expansion to accelerate) and dark matter (an unknown and invisible form of matter that is five times more common than normal matter). This is what one would also observe of blueberries in an expanding muffin.
The history of the measurement of Hubble’s Constant has been fraught with difficulty and unexpected revelations. In 1929, Hubble himself thought the value must be about 342,000 miles per hour per million light years – about ten times larger than what we measure now. Precision measurements of Hubble’s Constant over the years are actually what led to the inadvertent discovery of dark energy. The quest to find out more about this mysterious type of energy, which makes up 70% of the energy of the universe, has inspired the launch of the world’s (currently) best space telescope, named after Hubble.
Now it seems that this difficulty may be continuing as a result of two highly precise measurements that don’t agree with each other. Just as cosmological measurements have become so precise that the value of the Hubble constant was expected to be known once and for all, it has been found instead that things don’t make sense. Instead of one, we now have two showstopping results.
On the one side we have the new, very precise measurements of the Cosmic Microwave Background – the afterglow of the Big Bang – from the Planck mission, which give a Hubble Constant of about 46,200 miles per hour per million light years (or, in cosmologists’ units, 67.4 km/s/Mpc).
On the other side we have new measurements of pulsating stars in local galaxies, also extremely precise, which give a Hubble Constant of 50,400 miles per hour per million light years (or, in cosmologists’ units, 73.4 km/s/Mpc). These measurements probe the universe closer to us in time.
Both these measurements claim their result is correct and very precise. The measurements’ uncertainties are only about 300 miles per hour per million light years, so it really seems like there is a significant difference between the two values. Cosmologists refer to this disagreement as “tension” between the two measurements – they are both statistically pulling results in different directions, and something has to snap.
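To put the two headline numbers side by side in both unit systems, here is a short sketch using only the values quoted in the article (it is not a formal statistical comparison; the published analyses quote their own detailed uncertainties):

```python
# Compare the two Hubble Constant values quoted in the article.
MLY_PER_MPC = 3.2616      # million light years per megaparsec
KM_PER_MILE = 1.609344

def to_mph_per_mly(h0_kms_mpc):
    """km/s/Mpc -> miles per hour per million light years."""
    return h0_kms_mpc / MLY_PER_MPC * 3600.0 / KM_PER_MILE

planck, local = 67.4, 73.4        # km/s/Mpc: early-universe vs. local value
gap = local - planck
print(f"Planck (CMB):      {to_mph_per_mly(planck):,.0f} mph per million ly")
print(f"Local (pulsators): {to_mph_per_mly(local):,.0f} mph per million ly")
print(f"Gap: {gap:.1f} km/s/Mpc (~{to_mph_per_mly(gap):,.0f} mph per million ly), "
      f"versus quoted uncertainties of ~300 mph per million ly")
```

The gap of roughly 4,100 mph per million light years is far larger than the ~300 quoted for each measurement, which is why the article treats the disagreement as genuine tension rather than noise.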
So what’s going to snap? At the moment the jury is out. It could be that our cosmological model is wrong. What is being seen is that the universe is expanding faster nearby than we would expect based on more distant measurements. The Cosmic Microwave Background measurements don’t measure the local expansion directly, but rather infer it via a model – our cosmological model. This model has been tremendously successful at predicting and describing a wide range of observational data about the universe.
So while this model could be wrong, nobody has come up with a simple convincing model that can explain this and, at the same time, explain everything else we observe. For example we could try and explain this with a new theory of gravity, but then other observations don’t fit. Or we could try and explain it with a new theory of dark matter or dark energy, but then further observations don’t fit – and so on. So if the tension is due to new physics, it must be complex and unknown.
A less exciting explanation could be that there are “unknown unknowns” in the data caused by systematic effects, and that a more careful analysis may one day reveal a subtle effect that has been overlooked. Or it could just be a statistical fluke that will go away when more data are gathered.
It is presently unclear what combination of new physics, systematic effects or new data will resolve this tension, but something has to give. The expanding muffin picture of the universe may not work anymore, and cosmologists are in a race to win a “great cosmic bake-off” to explain this result. If new physics is required to explain these new measurements, then the result will be a showstopping change of our picture of the cosmos.
Journal reference: Astrophysical Journal
Provided by: The Conversation
October 2, 2018 at 5:05 pm · #53282 · c_howdy (Participant)
[Image caption] A supernova (bright spot at lower left) and its host galaxy (upper center), as they would appear if gravitationally lensed by an intervening black hole (center). The gravitational field of the black hole distorts and magnifies the image and makes both the galaxy and the supernova shine brighter. Gravitationally magnified supernovas would occur rather frequently if black holes were the dominant form of matter in the universe. The lack of such findings can be used to set limits on the mass and abundance of black holes. Credit: Miguel Zumalacárregui, UC Berkeley.
https://phys.org/news/2018-10-black-holes-universe-dark.html
Black holes ruled out as universe’s missing dark matter
October 2, 2018, University of California – Berkeley
For one brief shining moment after the 2015 detection of gravitational waves from colliding black holes, astronomers held out hope that the universe’s mysterious dark matter might consist of a plenitude of black holes sprinkled throughout the universe.
University of California, Berkeley, physicists have dashed those hopes.
Based on a statistical analysis of 740 of the brightest supernovas discovered as of 2014, and the fact that none of them appear to be magnified or brightened by hidden black hole “gravitational lenses,” the researchers concluded that primordial black holes can make up no more than about 40 percent of the dark matter in the universe. Primordial black holes could only have been created within the first milliseconds of the Big Bang as regions of the universe with a concentrated mass tens or hundreds of times that of the sun collapsed into objects a hundred kilometers across.
The results suggest that none of the universe’s dark matter consists of heavy black holes, or any similar object, including massive compact halo objects, so-called MACHOs.
Dark matter is one of astronomy’s most embarrassing conundrums: despite comprising 84.5 percent of the matter in the universe, no one can find it. Proposed dark matter candidates span nearly 90 orders of magnitude in mass, from ultralight particles like axions to MACHOs.
Several theorists have proposed scenarios in which there are multiple types of dark matter. But if dark matter consists of several unrelated components, each would require a different explanation for its origin, which makes the models very complex.
“I can imagine it being two types of black holes, very heavy and very light ones, or black holes and new particles. But in that case one of the components is orders of magnitude heavier than the other, and they need to be produced in comparable abundance. We would be going from something astrophysical to something that is truly microscopic, perhaps even the lightest thing in the universe, and that would be very difficult to explain,” said lead author Miguel Zumalacárregui, a Marie Curie Global Fellow at the Berkeley Center for Cosmological Physics.
An as-yet unpublished reanalysis by the same team using an updated list of 1,048 supernovas cuts the limit in half, to a maximum of about 23 percent, further slamming the door on the dark matter-black hole proposal.
“We are back to the standard discussions. What is dark matter? Indeed, we are running out of good options,” said Uroš Seljak, a UC Berkeley professor of physics and astronomy and BCCP co-director. “This is a challenge for future generations.”
The analysis is detailed in a paper published this week in the journal Physical Review Letters.
Their conclusions are based on the fact that an unseen population of primordial black holes, or any massive compact object, would gravitationally bend and magnify light from distant objects on its way to Earth. Therefore, gravitational lensing should affect the light from distant Type Ia supernovas. These are the exploding stars that scientists have used as standard brightness sources to measure cosmic distances and document the expansion of the universe.
Zumalacárregui conducted a complex statistical analysis of data on the brightness and distance of supernovas catalogued in two compilations – 580 in the Union and 740 in the joint light-curve analysis (JLA) catalogs – and concluded that, if black holes made up the dark matter, about eight of them should be brighter by a few tenths of a percent than predicted based on observations of how these supernovas brighten and fade over time. No such brightening has been detected.
Other researchers have performed similar but simpler analyses that yielded inconclusive results. But Zumalacárregui incorporated the precise probability of seeing all magnifications, from small to huge, as well as uncertainties in brightness and distance of each supernova. Even for low-mass black holes—those 1 percent the mass of the sun—there should be some highly magnified distant supernovas, he said, but there are none.
“You cannot see this effect on one supernova, but when you put them all together and do a full Bayesian analysis you start putting very strong constraints on the dark matter, because each supernova counts and you have so many of them,” Zumalacárregui said. The more supernovas included in the analysis, and the farther away they are, the tighter the constraints. Data on 1,048 bright supernovas from the Pantheon catalog provided an even lower upper limit—23 percent—than the newly published analysis.
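To make the “each supernova counts” logic concrete, here is a deliberately simplified toy calculation. It is not the collaboration’s actual Bayesian analysis (which models the full magnification distribution and each supernova’s uncertainties); it only shows how a non-detection turns into an upper limit, using a made-up per-supernova lensing probability:

```python
import math

# Toy version of a non-detection limit -- NOT the published analysis.
N_SUPERNOVAE = 740        # size of the JLA sample mentioned in the article
TAU_IF_ALL_BH = 0.02      # hypothetical chance that a supernova is noticeably
                          # magnified if ALL dark matter were black holes
                          # (placeholder value, not taken from the paper)

def upper_limit_on_fraction(n_sn, tau, confidence=0.95):
    """If a fraction f of dark matter is in black holes, the expected number of
    magnified supernovae is lam = f * n_sn * tau.  Seeing zero of them has
    Poisson probability exp(-lam), so any f with exp(-lam) < 1 - confidence is
    excluded; the boundary is f = -ln(1 - confidence) / (n_sn * tau)."""
    return -math.log(1.0 - confidence) / (n_sn * tau)

f_max = min(upper_limit_on_fraction(N_SUPERNOVAE, TAU_IF_ALL_BH), 1.0)
print(f"Toy 95% upper limit on the black-hole fraction of dark matter: {f_max:.2f}")
```

Because the output depends entirely on the placeholder lensing probability, it should not be compared with the paper’s quoted 40 and 23 percent limits. The takeaway is only the scaling: the limit tightens roughly as one over the number of supernovae times the lensing probability, which is why the larger Pantheon sample pushes it lower.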
Seljak published a paper proposing this type of analysis in the late 1990s, but when interest shifted from looking for big objects, MACHOs, to looking for fundamental particles, in particular weakly interacting massive particles, or WIMPs, follow-up plans fell by the wayside. By then, many experiments had excluded most masses and types of MACHOs, leaving little hope of discovering such objects.
At the time, too, only a small number of distant Type Ia supernovas had been discovered and their distances measured.
Only after the LIGO observations brought up the issue again did Seljak and Zumalacárregui embark on the complicated analysis to determine the limits on dark matter.
“What was intriguing is that the masses of the black holes in the LIGO event were right where black holes had not yet been excluded as dark matter,” Seljak said. “That was an interesting coincidence that got everyone excited. But it was a coincidence.”
More information: Miguel Zumalacárregui et al, Limits on Stellar-Mass Compact Objects as Dark Matter from Gravitational Lensing of Type Ia Supernovae, Physical Review Letters (2018). DOI: 10.1103/PhysRevLett.121.141101
Journal reference: Physical Review Letters
Provided by: University of California – Berkeley