For context, I think this is the same infamous He Jiankui:
https://en.wikipedia.org/wiki/He_Jiankui
https://knowyourmeme.com/memes/people/jiankui-he
Dude genetically engineered babies to be immune to HIV.
I bet this guy is flush with money offers despite the ethics and legality of what he did, and the years he spent in jail.
No one will touch this guy with a ten foot pole. Nothing he did technically was novel - it was just that everyone who had the skills to edit an embryo was unwilling and uninterested in doing so. Having him as part of your organization basically broadcasts to the world that you’re going to be doing wildly unethical things. Not a great path to commercialization of any therapeutic.
He Jiankui is better known for performing the first germ-line (i.e. heritable by the edited person's children) genome editing of humans.
After reading the abstract I'm not sure what they are trying to prove. None of their examples are relevant to "spontaneous" emergence of hierarchy; they are all somehow tied to environmental or economic factors.
Hierarchy is definitely useful in some cases but has interesting tradeoffs. In emergency conditions it's very useful to have a strong hierarchy (especially if the leader has experience with that type of emergency), but during 'good times' strong top-down regulation represses creativity and adaptability.
Alternating between phases of hierarchy to consolidate good ideas from phases with high generation of ideas/diversity is probably ideal, and is probably what I would have looked into if I were studying hierarchy.
I'm going to read more of the thesis to be sure, but the part about VDJ recombination seems tenuous - the fact that some aspects of VDJ recombination are regulated or vary between individuals shouldn't surprise anyone, since environments and diseases vary all over the world. It's also not a new finding.
Here's some better reading about the origins of antigen receptor diversity, or as some people call it, the Generation of Diversity (GOD):
Another manifestation of GOD (2004) https://www.nature.com/articles/430157a
Evolutionarily conserved TCR binding sites, identification of T cells in primary lymphoid tissues, and surprising trans-rearrangements in nurse shark (2010) https://pubmed.ncbi.nlm.nih.gov/20488795/
Evidence of G.O.D.’s Miracle: Unearthing a RAG Transposon (2017) https://pmc.ncbi.nlm.nih.gov/articles/PMC5428540/
Origin of immunoglobulins and T cell receptors: A candidate gene for invasion by the RAG transposon (2025) https://pubmed.ncbi.nlm.nih.gov/40614193/
Edit: did not realize this was written by the He Jiankui, https://en.wikipedia.org/wiki/He_Jiankui#Human_gene-editing_...
Makes sense that his thesis was in biophysics, not in biology itself. In a biology department someone would probably have disabused him of his top-down control tendencies.
I was going to say that I was taught that VDJ recombination is pseudo-random at best, working generally from proximal to distal segments.
Would be interesting to extend to observations of chaos or entropy one level above each recognizable hierarchy, forcing a new organizing paradigm.
I believe I’ve done that here:
https://kemendo.com/GTC.pdf
Lost me at "The main theme of biology in twentieth-century is an attempt to reduce biological phenomena to the behavior of molecules". Maybe the theme of biophysics in the 80s-2000s, but certainly not all of biology. Evolution? The central dogma? The cell + DNA + evolution is what I'd put as the main themes. At least toward the end of the century, the ideas of emergence and hierarchy could be found in any biology or biophysics textbook.
Having done it myself, I really hate the apparently irresistible pull to set up a straw man of your field in the abstract/intro and then claim your minor results resolve it. I guess it's part of science now, but I wish it could at least be confined to job talks(1).
Continuing "We argue here that "hierarchy" is a critical level of biological organization". Welcome to the club. Again, any biology/biophysics textbook worth its salt from the 90s on (conservatively) would include probably by page 50 a picture/discussion of the multiple scales involved and probably even mention hierarchical organization explicitly.
It's just hard to take seriously. What is he actually trying to prove/show? Searching Google Scholar, I'm prematurely concluding he applied existing clustering methods (clustering was very sexy in statistical physics right around 2010) and found some modularity across scales. You couldn't throw a rock 10 feet in a physics/biophysics department around that time without hitting someone doing a clustering study to show some modular/hierarchical structure in some biological or otherwise "complex" system (trade networks in his case).
Bah I think I'm just in a bad mood lol don't mind me.
Edit- I just noticed he threw in spontaneous. I don't understand what that adds to the description besides making it sound more complicated.
(1) Which reminds me of one job talk I sat in on (physics department) where the speaker tried to pass off Levinthal's "paradox" of protein folding as unresolved until he graced the field with his brilliance. Maybe he thought no one in the department knew anything about proteins? I was almost impressed by the boldness.
(2011)
I had GPT-5 summarize those 200 pages. I forgot to remove the "robot" personality, and it initially provided a bunch of engineering-oriented concepts as a "summary". Quite an interesting take:
Non-robot version:
Complex systems stay healthy when they have a small, stable core and a flexible edge. Put the non-negotiables in the core (e.g., data formats, auth, money flows) and keep them steady; let everything else move fast behind small, well-defined “doors.” This makes changes safer and keeps failures from spreading.
Watch for early warning signs of fragility by taking a simple weekly snapshot of “who talks to whom.” If you see more cross-team links, features that touch many parts at once, rising shared state, slower reviews, and more incidents at the same time, the structure is getting tangled. Short term, act like traffic control: add queues, throttle chatty components, turn off non-essential cross-links, and put a clear decision point in the middle until things calm down. Then clean up: shrink interfaces, move logic back into the right modules, delete shortcuts, and keep the core small.
For fast-changing threats or products (like flu strains or quick-iterating models), run a rolling check: each month, map new versions by “how different from today’s target” and “how common.” When a new cluster is far enough away and growing, switch targets or branch a new baseline. Weight recent data more so you react quickly, but keep older patterns around for backup.
Robot/Nerd version:
Many complex systems work best when built as hierarchical modules: a small, stable kernel (shared rules or core services) and a faster-evolving periphery connected through narrow, explicit interfaces. Define the kernel by a dependency graph’s center (k-core, betweenness, in-degree) and freeze it between releases; let the periphery change under tests that enforce interface contracts and resource ownership. This structure increases robustness to shocks and preserves evolvability.
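The kernel-identification step above can be sketched as a maximal k-core computation: repeatedly strip nodes with fewer than k neighbors and call whatever survives the "core." The service names, edges, and choice of k below are made-up illustrations, not anything from the thesis.

```python
# Minimal k-core sketch on a toy "dependency graph" (undirected for coreness).
# All node names and the value of k are illustrative assumptions.

def k_core(adj, k):
    """Return the node set of the k-core: repeatedly remove nodes of degree < k."""
    adj = {u: set(vs) for u, vs in adj.items()}  # work on a copy
    changed = True
    while changed:
        changed = False
        for u in list(adj):
            if len(adj[u]) < k:
                for v in adj.pop(u):       # remove u and its incident edges
                    adj[v].discard(u)
                changed = True
    return set(adj)

# Hypothetical service graph: four densely coupled core services plus one UI at the edge.
deps = {
    "auth":    {"db", "api", "billing"},
    "db":      {"auth", "api", "billing"},
    "api":     {"auth", "db", "billing", "ui"},
    "billing": {"auth", "db", "api"},
    "ui":      {"api"},
}

kernel = k_core(deps, k=3)
print(sorted(kernel))  # → ['api', 'auth', 'billing', 'db']; "ui" is peripheral
```

In practice you'd run this (or `networkx.k_core`) on the real import/call graph and freeze interfaces around whatever the core turns out to be.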
Instrument the system as a time-sliced interaction graph and track structure: modularity (Q) (Newman–Girvan), hierarchy indices (Krackhardt (H), cophenetic correlation from a dendrogram), depth via k-core levels, density, clustering, and assortativity. Use control charts or EWMA to flag regime shifts; a “flattening” pattern is falling (H)/cophenetic, falling (Q), rising density without added depth. When flagged, respond with high-leverage moves: restore module boundaries, add buffers/queues, reduce cross-module coupling, and if needed apply temporary central coordination during the acute phase, then return authority to modules once metrics normalize.
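The EWMA flagging step could look like the sketch below, applied to a weekly modularity (Q) series. The series values, smoothing factor, and alert band are illustrative assumptions, not calibrated numbers.

```python
# Sketch: EWMA drift detector for a structural metric (here, weekly modularity Q).
# A sustained drop of the smoothed value below (baseline - band) flags "flattening."

def ewma_flags(series, lam=0.3, baseline=None, band=0.05):
    """Return a per-point list of booleans: True where the EWMA has drifted
    more than `band` below `baseline` (default: the first observation)."""
    if baseline is None:
        baseline = series[0]
    s, flags = baseline, []
    for x in series:
        s = lam * x + (1 - lam) * s   # exponentially weighted moving average
        flags.append(s < baseline - band)
    return flags

# Hypothetical series: stable modularity, then decay as cross-module links accumulate.
q_weekly = [0.62, 0.61, 0.63, 0.60, 0.55, 0.50, 0.46, 0.44]
flags = ewma_flags(q_weekly)
print(flags)  # → [False, False, False, False, False, True, True, True]
```

The smoothing means one noisy week doesn't trip the alarm, but a sustained decline does; a real deployment would track H and density the same way and alert on the combined pattern.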
For fast-drift domains (e.g., influenza strains or rapidly iterated model versions), run a rolling pipeline: monthly sequence or feature alignment; compute an effect-relevant distance (e.g., epitope-weighted “p_epitope” for HA, or capability-weighted deltas for models); embed to 2D (MDS/UMAP) and cluster (DBSCAN/HDBSCAN); declare an emerging cluster when its centroid crosses a pre-validated distance threshold from the reference and its prevalence or growth rate exceeds your preset cutoff; act (update vaccine strain/target or branch a new baseline). Maintain a recency-weighted memory that favors the newest clusters while retaining older patterns for baseline coverage.
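The rolling drift check reduces to: group this month's variants by distance, then flag any group whose centroid is far from the current reference and whose prevalence is high enough. The sketch below substitutes a naive greedy single-link grouping for DBSCAN/HDBSCAN, and every vector, count, and threshold is a made-up illustration.

```python
# Sketch: declare an "emerging cluster" when it is both distant from the
# reference target and sufficiently prevalent. Naive single-link grouping
# stands in for DBSCAN; all data and thresholds are illustrative assumptions.
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def centroid(vecs):
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def emerging_clusters(variants, reference, d_link=1.0, d_switch=2.0, min_share=0.2):
    """variants: list of (feature_vector, count). Returns clusters that cross
    both the distance threshold (d_switch) and the prevalence cutoff (min_share)."""
    clusters = []  # each: {"vecs": [...], "count": n}
    for vec, count in variants:
        for cl in clusters:
            if any(dist(vec, v) <= d_link for v in cl["vecs"]):  # single-link join
                cl["vecs"].append(vec)
                cl["count"] += count
                break
        else:
            clusters.append({"vecs": [vec], "count": count})
    total = sum(c["count"] for c in clusters)
    return [c for c in clusters
            if dist(centroid(c["vecs"]), reference) >= d_switch
            and c["count"] / total >= min_share]

# Hypothetical monthly snapshot: two variants near the reference, two far away.
ref = [0.0, 0.0]
month = [([0.1, 0.0], 50), ([0.2, 0.1], 30), ([3.0, 3.0], 15), ([3.2, 3.1], 25)]
hits = emerging_clusters(month, ref)
print(len(hits))  # → 1: a distant cluster with a third of the counts, so re-target
```

Everything hard in the real pipeline lives in the distance function (epitope weighting, capability deltas); the thresholding and memory logic around it stays about this simple.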