...(continued)I agree that OSD is very expensive (whether that's for SW running time or an ASIC implementation); so replacing it with other decoders, as that paper does with AC, is a move in the right direction. I haven't looked at AC in detail to judge its complexity yet, but that's in the ever-growing queue ;-
...(continued)I agree with your comment. Very recently, we also submitted a similar paper:
https://arxiv.org/abs/2503.17307
Originally, we also wanted to name it exactly like this one: "Quantum theory does not need complex numbers". But then we changed the title because - exactly - what we all do is simulation
...(continued)A little comment on this: while the 10k iterations sound like too much, it seems that in terms of time it is better to have 1) 10k BP iterations + OSD than 2) 20 BP iterations + OSD. Basically you will be allowing BP to converge more often, so you'll need the expensive OSD step less often. They comment on this th
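(A rough back-of-envelope way to see this, under the simplifying assumption that OSD is only invoked when BP fails to converge: the expected decoding time per shot is about $$t_{\rm BP} + (1 - p_{\rm conv})\, t_{\rm OSD},$$ where $p_{\rm conv}$ is the probability that BP converges within the iteration cap. Raising the cap pays off whenever the extra BP time is smaller than the OSD time saved by the higher $p_{\rm conv}$.)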
...(continued)Congratulations on the wonderful results! I am wondering if the local (approximate) Markovianity is proved under the assumption that there is no topological order. As in BK19, the contribution from the topological entanglement entropy may violate the condition for the existence of a recovery map for non-contracti
...(continued)With the two-reals struct, one may run into different unpleasant issues for the resulting theory. Mathematically, it looks like a two-level system, but with severe non-locality issues. If you admit that it's only a formal system not existing as a localized quantum system, you are introducing differe
...(continued)You're combining the two agents using the wrong tensor product. I said the programmer would implement a tensor product that behaved on the two-reals struct according to the usual rules for complex numbers. You are instead describing a tensor that treats the two fields of the struct as if they were a
...(continued)Dear Craig,
You will find this concern addressed in reference [M.-O. Renou et al, Nature 600, 625-629, 2021] cited in the abstract of this paper. Essentially, your idea is to read one complex number as two real numbers. This brings up the issue of introducing an extra degree of freedom that stores
...(continued)I'm sorry, this is going to be rude, but could you confirm whether or not this is intended as an April Fools' joke? It's hard to tell this week.
It's already known that you can implement the complex numbers using only real numbers. The common mapping is into pairs of real numbers. In the C program
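For concreteness, a minimal sketch of this kind of mapping in C (the struct and function names are illustrative, not taken from any of the papers):

```c
#include <stdio.h>

/* Minimal sketch: a complex number stored as a struct of two reals. */
typedef struct {
    double re;
    double im;
} complex_t;

/* Complex multiplication via the usual rule:
   (a + bi)(c + di) = (ac - bd) + (ad + bc)i */
static complex_t cmul(complex_t x, complex_t y) {
    complex_t z;
    z.re = x.re * y.re - x.im * y.im;
    z.im = x.re * y.im + x.im * y.re;
    return z;
}

int main(void) {
    complex_t i = { 0.0, 1.0 };   /* the imaginary unit */
    complex_t z = cmul(i, i);     /* i * i should give -1 + 0i */
    printf("%g + %gi\n", z.re, z.im);
    return 0;
}
```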
...(continued)Thanks for the script; I'll take a look at it in more detail later (it's late here).
I did a quick calculation of the weight enumerators of the $[[30,4,5]]$ codes in the two papers and they did come out to be exactly the same (a bit of a surprise). Here are the first few terms: 0 1
...(continued)@qodesign My co-author, Tobias Haug, compared the Tanner graph of the [[30,4,5]] code from our paper and Nicolas's paper and found that they are isomorphic. Here is the link to the Python script on Google Colab: https://colab.research.google.com/drive/1IkHHI6Uz6du5aMCNK1WR58CeAOkPSVRC?usp=sharing.
Congratulations for this very interesting work! We also considered a different BP ensemble technique for LDPC codes in https://arxiv.org/abs/2503.01738. It would be interesting to benchmark the different approaches and see if a merged ensemble would work.
Congratulations for this very interesting work! We also considered a different BP ensemble technique for LDPC codes in https://arxiv.org/abs/2503.01738. It would be interesting to try your approach for circuit level noise and see if a merged ensemble of both our methods would work.
Thanks, Zeyao. Sadly, all we're left with is absurdity.
Fortunately, we live in the best of all possible worlds.
Thanks, Andrea, and thanks for the correction.
Sadly, we tossed out all the samples (and friends) made along the way.
Very nice suggestion!
...(continued)Nice result! The proof of the bounds' optimality is so elegant. It's funny how results about extremal codes depend a lot on each other. We've reached the 'good LDPC -> BPT-saturating codes -> locality-tradeoffs-saturating codes' part of the chain. I wonder if there is anything up next?
Also, do you
Step 1 seems time consuming and not strictly necessary for Step 2.
Hi Earl Campbell:
I really appreciate your comments! I will discuss it in the next version!
...(continued)While I have only read the title and figure captions of this, I feel compelled to comment on Figure 1, which exaggerates the exponential growth in citations.
It is a well-known phenomenon (anecdotal evidence not cited) that Google Scholar overestimates citations in the previous year. This can g
This is related with the long-term behavior as discussed in the last section.
Why didn't I see the exponential quantum advantage from your PhD supervisor's citations?
A brilliant application of reductio ad absurdum!
Dear Bori,
thank you for your comment. Regarding your final question, please see my answer to Zhiyuan above. We will clarify this some more in version 2 of our paper.
Best wishes,
Markus
...(continued)Dear Zhiyuan,
thank you for this, I agree this is a natural follow-up question to ask. And I agree that the answer should be “no”. In addition to the local-versus-global invariance that you have pointed out again, let me give another view on this.
Our starting point is the postulate that two (
...(continued)Hi Kishore,
In your paper $z=xy$, so it might have been better to describe the codes as polynomials in two variables. The $[[30,4,5]]$ code in Nicolas's paper would be: $$[1+x\,|\,1+y+x^2y^2],\quad l=5,\ m=3,$$ which is different from your $$[x+x^4y^4\,|\,x+y^2+x^2y^2],\quad l=3,\ m=5.$$ These are small enough codes
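(For anyone following along, and hedging a bit since the exact conventions in the two papers may differ: in the standard bivariate bicycle setup, $x = S_l \otimes I_m$ and $y = I_l \otimes S_m$ are commuting cyclic shifts, with $S_n$ the $n\times n$ cyclic-shift matrix, and a pair $[A\,|\,B]$ of polynomials in $x,y$ defines the CSS check matrices $$H_X = [A\,|\,B],\qquad H_Z = [B^T\,|\,A^T].$$ So $[1+x|1+y+x^2y^2]$ above means $A = I + x$ and $B = I + y + x^2y^2$.)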
Just wait until a quantum AI has found our universe. It will mine all the online text data from all the other universes to write a better paper.
Typo at the bottom of pg. 5: "you clearly cannot have quantum advantage with an imaginary number in there somewhere". Shouldn't it be _without_?
Excellent paper otherwise. As the saying goes, the real multiverse was the samples we made along the way.
Hi! To chip in: for the codes from https://iopscience.iop.org/article/10.1088/2058-9565/ad5eb6/pdf , we also found a small number of BP iterations to perform best. We think it's due to the relatively large number of small cycles and symmetric stabilizers.
...(continued)Thanks, Nicolas. Since $Q_l$ is a circulant matrix, your $A$ ($A_1 + A_2$ in your paper) and $B$ ($A_3 + A_4 + A_5$ in your paper) matrices are circulant and, hence, can be expressed as multivariate polynomials. Thus, your codes can be captured by our multivariate formalism. I explicitly calculated the parity-chec
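(The step being used here, assuming $Q_l$ denotes the $l \times l$ cyclic-shift permutation matrix as in the paper: any $l \times l$ circulant matrix can be written as a polynomial in the shift, $$C = \sum_{i=0}^{l-1} c_i\, Q_l^{\,i},$$ so sums of powers of $Q_l$, such as $A_1 + A_2$, translate directly into polynomials.)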
Thank you Kishor! The parity check matrices of our [[30,4,5]] and [[48,4,7]] codes are described in Table II.
...(continued)Thanks, Nicolas. I loved reading your work yesterday. In our paper, we’re working with the multivariate polynomial quotient ring to pick out matrices A and B. I noticed you go with an exhaustive search to select permutation matrices, then tweak the number of terms in A and B to find low-weight codes
...(continued)Interesting data point! It's not what I usually see, but I'll take a look at your results to see why. I don't use the Python BP-OSD package (I have a homegrown C version) and I might select the candidates differently. Anyway, it's worth looking into this in more detail. As far as practical implementa
...(continued)R.e. the large number of BP iterations, my coauthors and I observed something similar when studying a different code family with comparable parameters https://arxiv.org/abs/2406.14445. In appendix E you can see that the decoder performance continues to increase all the way up to 10000 BP iterations.
...(continued)Thanks for the clarification.
The parameters seem a bit unusual: 10000 iterations is huge (normally you hit diminishing returns after ~20 iterations); also the osd_order seems low. Most other papers use ~50, although not all sources define "osd_order" the same way; the number of candidates con
...(continued)Hi @qodesign,
Thanks for pointing out that the parameters of BP-OSD are missing. We will include them in the next version. For convenience here they are:
max_bp_iters = 10_000
bp_method = 'min_sum'
osd_order = 5
osd_method = 'osd_cs'

As a sanity check to test our simulator, we reproduced the s
...(continued)What are the parameters of BP-OSD?
I'm familiar with the paper in the first comment; it contains a lot of results, but again the BP-OSD parameters are not given and the noise models are not clear. I couldn't verify many of the claims there.
Also it looks like one paper reports logical error rat
...(continued)Thanks Kishor for letting us know. We will add a ref to your paper in the next version.
Edit: A few comments after having a closer look at your paper. First, it's a nice paper, congrats! These variants of BB codes are indeed similar, and we will add a link to your paper in the next version of ours.
Congratulations! Just wanted to mention that we had proposed the weight-5 [[30,4,5]] code and several other such codes in https://arxiv.org/abs/2406.19151 (Page 3, Table 1 and the supplementary material.)
Reminds me of a Nirvana song called "Smells like Quantum Advantage".
How can emerging physical-layer security strategies be systematically integrated into 6G ISAC networks to simultaneously safeguard both communication data and sensing information, while preserving the ultra-low latency and dual functionality that define ISAC’s commercial appeal?
How can the 6GStarLab mission, envisioned as an open, flexible in-orbit platform, enable experimentation of future 3GPP NTN standards across multiple frequency bands and optical links while ensuring reliability, scalability, and adaptability for diverse use cases in dynamic orbital environments?
...(continued)How can a networked ecosystem of autonomous and embodied AI agents, each potentially possessing distinct goals, learning algorithms, and generative foundation models, maintain consistent, coherent, and up-to-date shared knowledge while operating under dynamic network conditions and diverse task requ
Shouldn't it actually be denoted by Ü, or would that be the umlaut umlaut information?
No problem, thank you!
Oh that's so cool! Sorry I missed it. I'll add a reference to your work in a future update :)
...(continued)Hi Nouédyn,
Congrats on your result! With regards to the open question that you stated about self-correcting memories in dimensions less than 4D, we wanted to share our [work which discusses self-correcting memories on fractal lattices][1] with Hausdorff dimension $D_H=4-\epsilon$, obtained by punc
...(continued)Thanks to both of you for your explanations. All this makes sense except that I am not fully convinced that we even know for which instance *sizes* quantum advantage is expected. It is plausible to me that at the sizes where random instances become classically intractable, all instances either conce