Is arXiv a monopoly bully in scientific publication?

After decades of ever-increasing dominance, arXiv has become the largest and most popular storage space, or eprint service, for scientific publications in physics and related fields. It would have been most beneficial to the physics community had arXiv stuck to its original principle of sharing new ideas and works quickly. Sadly, it is becoming more and more arrogant these days instead. Veiled censorship by arXiv is turning it into just another giant refereed journal system, but without any transparency.

This is especially bad for two reasons. First, arXiv is the largest preprint archive in physics, and the wide user community it has built since the 1990s makes it indispensable. It holds a monopoly in physics publication, as no other similar service can compete. Second, arXiv's moderation is much worse than a normal journal's peer review: it is anonymous, provides no substantive response, and operates as a black box. There is no real mechanism to regulate arXiv's bullying behavior as a monopoly. As such, arXiv has made new (especially non-orthodox or disruptive) ideas hard to spread.

There have already been quite a few reports exposing arXiv's censorship issues. One of the most high-profile may be the experience of Brian Josephson (Nobel laureate in Physics, 1973), as described in “Covert censorship by the physics preprint archive”.

I have endured similar issues with arXiv. I am blacklisted as well: I am no longer allowed to post any article to the HEP categories. The several articles I posted there before were moved to the general category gen-ph, including a couple that were published in HEP journals (Phys. Lett. B and Phys. Rev. D). Even when I recently tried to post articles to the “allowed” gen-ph category, they have been on hold for nearly two months so far. I would not be surprised if they eventually deny both, as they did for my other submissions earlier. Ironically, I wrote one of them for the Gravity Research Foundation 2021 Essay Competition, and it has just earned an Honorable Mention. What is even more ironic is that at least 13 other papers submitted at the same time for this renowned competition were accepted by arXiv but earned nothing in the competition. I am not saying those 13 articles do not deserve to be on arXiv; they do. But why did arXiv block mine, which the GRF referees regarded as scientific and better?

The best discussion of arXiv's censorship may be the anonymous comments I found on the web. Indeed, it is not arXiv's business to perform peer review on submissions. Instead, it should provide the best platform for helping “scientific startups” grow into the next “scientific giants” (as in the business world). After all, it is the researchers who post their ideas on arXiv who bear all the risks.

What would be better ways of running arXiv? Most importantly, it should stick to its genuine original commitment: quick sharing and storage of new ideas. It should accept all honest scientific publications meeting typical social standards (no abuse, prejudice, violence, plagiarism, etc.) and minimal scientific standards (excluding only clearly non-scientific works). Such minimal scientific standards are like the basic requirements for a business license. The business world has the most successful record of fostering abundant startup companies, which occasionally grow into the next market leaders. The academic world, especially arXiv and the funding agencies, needs to learn from its business counterparts.

Many other issues with arXiv, including its endorsement system, originate from its over-regulation or arrogance. I suspect its moderators consider themselves the defenders of scientific orthodoxy, like the medieval Church to a lesser extent. Maybe we should look forward to the next true scientific Renaissance that will hopefully stop such pro-orthodoxy practices altogether.

Just to show another example of arXiv's arrogance, its compulsory LaTeX-source submission policy is getting more and more annoying. They might have a point in forcing people to use their LaTeX system if it were the best and most up-to-date compiling system. Instead, they run an old system missing features and packages. Even if a manuscript compiles well under one's own LaTeX setup, it may not behave the same way under arXiv's. In such cases, one has to live with the ill-formatted PDF file generated by arXiv, and sometimes the manuscript may not compile under arXiv's system at all. But arXiv still bullies you into using its system no matter what, and even takes very aggressive steps to make sure you cannot get around it. This is much worse than what Microsoft or any other business monopoly has done: you can still work around the restrictions (e.g., on browsers) set by business giants, but you cannot when it is arXiv.
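One practical way to anticipate such discrepancies, assuming a standard TeX distribution on both ends, is LaTeX's built-in \listfiles command, which records every loaded class and package version in the compile log. The snippet below is just an illustrative sketch, not arXiv's prescribed workflow:

```latex
% Minimal sketch: place \listfiles before \documentclass so the local
% .log file ends with a "File List" of every class/package version used.
\listfiles
\documentclass{article}
\usepackage{amsmath} % its version number will appear in the log's file list
\begin{document}
Compare the ``File List'' in the local \texttt{.log} file against the
processing log arXiv reports, to spot package-version mismatches before
they produce an ill-formatted PDF.
\end{document}
```

This does not fix arXiv's outdated packages, but it at least turns a mysterious formatting difference into a concrete list of version mismatches.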

In the end, it is the authors of the intellectual work who should have the right to decide how the work is presented. Unlike regular journals, arXiv is not a real publication system but a storage one. It could ask authors to upload their source files and meanwhile give them the option of using their own PDF file for the final presentation. By no means should arXiv force anyone to use its compiling system, especially such a poor one.

They list five reasons for requiring LaTeX-source upload, which do nothing but expose their arrogance and absurdity:

  1. “latex is plain text and easy for future document migrations.” This is probably the most credible excuse. But the most important thing for a scientific work is to keep it authentic. Most likely, authors have checked the PDF file they generated themselves far more carefully than the one arXiv creates. The original author-generated PDF file should automatically be included in the upload; if there is ever any doubt, it should serve as the reference.
  2. “Using emerging new technology -> hyperlinks”. How many times have they actually reprocessed the old manuscripts? Do people really care? Are the old links still working? And if the reprocessing messed things up, you would need the original PDF file for comparison anyway.
  3. This reason merely states that LaTeX can generate PDF. It is moot if an author-generated PDF file is provided.
  4. “There is no single Postscript standard” applies to PS files, not PDF files.
  5. “Cross-referencing within arXiv is added automatically with hyperlinked Postscript”, again, applies to PS files, not PDF files.

All these questionable and bullying practices by arXiv have provoked countermeasures from some physicists. In particular, a more lenient archive system, viXra, was established for unorthodox articles. A website was created against arXiv's blocking activities and censorship. Other preprint services are also emerging, e.g., OSF Preprints. However, none of them yet has influence comparable to arXiv's. For the healthy advancement of science, we need a better service for preprints. Reform of this and other aspects (e.g., funding), by advocating the principles of open science, will be critical to avoid stifling progress in science.

Over-moderation makes another refereed journal

I just so happened to stumble upon a very interesting comment on arXiv's moderation issues (certainly more interesting than the original blog article). I could not agree more with the commenter. It is almost exactly what I would have liked to write about the issue, though probably not as well articulated as the anonymous commenter managed. In particular, I have made similar comparisons between science and business: startup companies vs. researchers with risky or novel scientific ideas, and so on. I cannot help but quote the comment in full here:

Continue reading “Over-moderation makes another refereed journal”

New paper on first principles

I just tried to post my new paper, “First Principles of Consistent Physics”, on arXiv. Unfortunately, it was put on hold immediately, and I then submitted it to the OSF eprint server. This is quite an exciting paper to me. It proposes new foundations and guiding principles for fundamental physics and cosmology, building and improving upon my earlier blog post “first principles of physics”. It should shed new light on further developments of the new mirror framework.

Continue reading “New paper on first principles”

First principles of physics

The approach of first principles has long been pursued in the development of physics. Ever since the establishment of the Standard Model of particle physics in the 1970s, pursuing a theory of everything has become popular among theoretical physicists as the latest first-principles approach to unifying all particles and interactions. However, we seem to live in a dynamic world, as indicated, e.g., by the discovery of an expanding Universe, and that is definitely at odds with the static picture of an ultimate unified theory of physics.

The dynamic picture tells us that time reversal symmetry has to be broken, and that it has to be the first broken symmetry. Whatever first principles we propose must break this symmetry naturally at the very beginning. And there is no reason why the current four-dimensional spacetime, in particular its dimensionality, cannot be dynamic. It is probably more natural to consider that spacetime has evolved dimension by dimension.

First of all, we propose and summarize the three first principles as follows:

  1. A measurable, finite physical world is assumed.
  2. The quantum version of the variational principle, in terms of Feynman's path integral formalism, is applied.
  3. Spacetime emerges via dimensional phase transitions (i.e., the time dimension inflated first, followed by the space dimensions).
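For the second principle, it may help to recall the standard textbook form of the path-integral formulation (a general reminder, not specific to this proposal):

```latex
Z \;=\; \int \mathcal{D}\phi \; e^{\, i S[\phi]/\hbar} ,
\qquad
\delta S[\phi] = 0 \ \text{(the classical variational principle, recovered as } \hbar \to 0\text{)} .
```

Here $S[\phi]$ is the action functional and the integral runs over all field configurations, so the classical stationary-action principle emerges as the dominant contribution in the $\hbar \to 0$ limit.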
Continue reading “First principles of physics”

How should private foundations support science?

It is remarkable that quite a few private foundations in the United States care about science and are enthusiastic about funding scientific projects. However, many of them, if not all, do not seem to know how to support science in a way that complements government funding agencies like the NSF and DoE.

Continue reading “How should private foundations support science?”

Time to reform peer-review

One of the critical features of scientific research is the so-called peer-review process applied before a scientific paper is officially published in a journal. Ideally, peer review, at least in its original purpose, should serve as a measure of quality control that benefits both authors and readers. Nowadays, however, it has become more and more of an obstruction to the advancement of science, particularly for radically new ideas and directions.

Continue reading “Time to reform peer-review”

How can a new idea be accepted by eminent physicists?

In a nostalgic review article titled “Twenty years of the Weyl anomaly” [Michael J Duff, Class. Quantum Grav. 11, 1387 (1994)], Duff recalled the history of his discovery of the Weyl or conformal anomaly in quantum theory with Derek Capper. Continue reading “How can a new idea be accepted by eminent physicists?”

Old Wine in New Bottles – How does science advance?

A lot of times science advances by incorporating or interpreting old ideas under new scenarios.

For example, Lorentz first proposed the so-called Lorentz transformation, but it was Einstein who correctly interpreted and applied it in his theory of special relativity. Yang and Mills first came up with the SU(2) gauge theory idea for studying nuclear isospin, but it was Glashow, Weinberg, Salam, and 't Hooft who found its best application, in the electroweak interaction, eventually leading to the most celebrated unification theory (the Standard Model) for all three gauge interactions of the known elementary particles.

Continue reading “Old Wine in New Bottles – How does science advance?”

Does the Universe Have a Mirror Sector?

[This is a repost of the popular introduction page on the new mirror matter theory]

Modern physics is pillared by Einstein's theory of general relativity (which defines spacetime and the gravitational force) and the Standard Model, the best known quantum theory (which governs quantum particles and the other known interactions). Despite the tremendous successes of the two theories and decades of further scientific effort, a wide range of puzzling phenomena remains in fundamental physics, and the dream of unifying general relativity and quantum theory has never come true.

Continue reading “Does the Universe Have a Mirror Sector?”

Invisible decays and equivalence of CP violation and mirror symmetry breaking scales

The COVID-19 pandemic has hindered my scientific production quite a bit, but my new paper on “invisible decays of neutral hadrons” is finally finished, though it should have been done months ago. It provides precise predictions for the invisible decay branching fractions of long-lived neutral hadrons that can readily be measured at existing collider facilities. The idea is that CP violation can be considered a direct result of spontaneous mirror symmetry breaking at staged quark condensation (e.g., at temperatures of 100 GeV to 100 MeV in the early Universe). For a neutral kaon system, this means that the CP and mirror breaking scales, i.e., the mixing strength and mass splitting parameters, should be the same.

Continue reading “Invisible decays and equivalence of CP violation and mirror symmetry breaking scales”