A friend and I recently wondered about how much carbon dioxide one could pull from the air using living organisms. Neither of us have an intuitive sense of scale for this kind of thing (or any experience with plants), so we did some back of the envelope calculations from a physical/chemical perspective.
Burning wood is a reaction that converts wood and oxygen into water, carbon dioxide, and energy (heat and light). Growing a tree is basically this process in reverse (Feynman famously observed that trees grow out of the air, not the ground).
We can bound the solar-powered growth rate of a living organism with just two numbers:
The energy released by burning wood: roughly 2e7 J/kg.
The solar power reaching Earth’s surface: roughly 1e3 W/m^2.
Dividing the latter by the former gives 5e-5 kg/(s m^2) and luckily there’s about 1e5 s/day, which gives us a bound of 5 kg/(m^2 day), which feels like the right ballpark. Call it 500 g/(m^2 day), since the sun isn’t shining for the whole day (and isn’t at full strength when it is).
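Spelled out in Python (the 2e7 J/kg heat of combustion for dry wood and the 1e3 W/m^2 peak solar flux are my round numbers):

```python
# Solar-limited biomass growth bound, from two numbers.
SOLAR_FLUX = 1e3          # W/m^2: peak sunlight at Earth's surface
WOOD_ENERGY = 2e7         # J/kg: rough heat of combustion of dry wood
SECONDS_PER_DAY = 86_400  # about 1e5

rate = SOLAR_FLUX / WOOD_ENERGY        # kg/(s * m^2), i.e. 5e-5
daily_bound = rate * SECONDS_PER_DAY   # kg/(m^2 * day)
print(f"{daily_bound:.1f} kg/(m^2 day)")  # -> 4.3 kg/(m^2 day)
```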
For a sense of scale: Joules (J) are units of energy and Watts (W) are units of power; 1 Watt is a Joule per second and a typical electric kettle is about 1e3 W.
This first calculation assumes solar energy is the limiting factor; I think that’s plausible:
There isn’t much carbon in the air: CO₂ is about 400 ppm, or roughly 0.7 g per cubic meter, so the ~9 kg of CO₂ behind 5 kg of biomass means processing on the order of 10,000 cubic meters of air daily. That still feels doable from a wind-blowing perspective: even a light breeze pushes far more air than that through a square meter each day.
How does this bound compare with actual observed biomass growth rates?
I’ve heard that bamboo grows fast; Wikipedia reports 4 cm/hour. Eyeballing Google images suggests bamboo has a 6 cm diameter, call it 25 cm^2; with wood making up 10% of the cross section (bamboo is hollow), that’s 10 cm^3/hour. A density of 0.6 g/cm^3 gives 6 g/hour, or roughly 140 g/day (bamboo grows around the clock).
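The per-stalk arithmetic, assuming growth continues day and night (bamboo famously does grow at night):

```python
# Rough mass gain of one fast-growing bamboo stalk.
growth_cm_per_hr = 4   # Wikipedia's fast-bamboo figure
area_cm2 = 25          # cross section for a ~6 cm diameter, rounded down
wood_fraction = 0.10   # bamboo is mostly hollow
density = 0.6          # g/cm^3

grams_per_hour = growth_cm_per_hr * area_cm2 * wood_fraction * density
print(f"{grams_per_hour * 24:.0f} g/day")  # -> 144 g/day
```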
I have no idea how many bamboo stalks could fit within a square meter, but 4 puts us at our theoretical upper bound. Definitely in the same ballpark.
These are still hypothetical numbers; what are typical yields?
The first article I found (“Bamboo can be more profitable than sugarcane and rice! Check out how”) gushes about Bambusa balcooa annual yields of 40 tonnes per acre, or about 27g / (m^2 day).
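For the record, the unit conversion from the article’s figures (taking 1 acre ≈ 4047 m²):

```python
# Convert 40 tonnes/acre/year into g/(m^2 day).
ACRE_M2 = 4047                 # square meters per acre
tonnes_per_acre_year = 40

g_per_m2_day = tonnes_per_acre_year * 1e6 / ACRE_M2 / 365
print(f"{g_per_m2_day:.0f} g/(m^2 day)")  # -> 27 g/(m^2 day)
```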
This is quite close to algae yields in open pond bioreactors: 25 g / (m^2 day).
Our World in Data on crop yields is probably more reputable than bamboo farming clickbait, and surprisingly lists a few crops as even more productive: Peruvian sugar cane comes in at 33 g / (m^2 day) and Dutch tomatoes at 334 g / (m^2 day) (though tomatoes are probably not a great reference from a carbon perspective, as they’re mostly water).
All of these come in around 20 times below the theoretical bound of 500 g / (m^2 day).
Am I missing a big factor somewhere? Perhaps nitrogen uptake or the entropic losses of, uh, being a complex living plant?
Or should I be happy that my single-digit back of the envelope calculation is as close as it is?
I’ll be doing more of these sorts of calculations, so let me know if you have any favorite examples or resources. I’m currently reading Cell Biology by the Numbers and shopping around for a good calculation medium so I’m not always typing things like “density of wood * 123 m^3 / hectare / 365” into Google. (I’ll probably either re-learn Mathematica or use some kind of DimensionEngine spreadsheet plugin.)
Developing software using popular languages and their library ecosystems relies on a lot of trust.
We run commands like cargo run and pip install so often that it’s easy to forget what they actually do: download arbitrary code from Internet strangers and run it on our machines with full network access and read/write permissions to our SSH keys, browser cookies, and other exciting folders.
This feels to me increasingly irresponsible:
Since I can’t hope to read (or understand!) all the code that I need to run, the best I can do is minimize the data at risk. If I run a new dependency’s build script in Project A, I’ll have to risk A’s source code being stolen, but I shouldn’t have to risk Project B’s source code (or my taxes!).
While this isn’t a new idea, it doesn’t seem to be an established practice. Almost all developers I know (including myself, up until last month) live the YOLO life, constantly downloading and running arbitrary code on the same computer (and under the same user account) they use to store their photos, financial records, etc.
What are our options for having a bit more security?
As with everything, it’s a matter of tradeoffs. Some solutions I considered, but ruled out:
Separate computers: Do your untrusted, running-arbitrary-code computations on a completely separate machine from the one on which you do your taxes, log in to your bank accounts, etc. One security researcher recommended this to me and it’s a reasonable solution (“chromebooks are cheap”), though I found it a bit too inconvenient given my frequent travel. Plus there’s a gray area of things I’d like to keep secure but need for dev work (e.g., GitHub and AWS logins).
QubesOS: “A reasonably secure operating system”. I ruled this one out because I work on a Mac. Otherwise it looks like the most baked “principle of least privilege” computing solution. I’d use this if I worked on Linux day-to-day.
Virtual machines: I spent a dozen hours looking into options here, in particular using Vagrant to provision project-specific VMs. Advantages include reproducibility and other conveniences (snapshots); downsides are increased resource usage and some awkwardness for specific workflows, especially graphical ones (latency; poor touchpad support). Those downsides ultimately led me to rule this out.
Docker: Same drawbacks as virtual machines, plus a lot of new tools and concepts I’d need to learn to use it effectively. (And those concepts won’t generalize to helping me run Windows on my Mac hardware.)
The solution I settled on is one that’s built into MacOS: Sandboxing.
Mac ships with a sandbox-exec command that lets one run commands with a fine-grained set of permissions.
As with so many problems, the interesting bit isn’t really the technical feasibility but the usability: How can I make sandbox-exec easy enough to use that I’ll actually do most of my computing within a sandboxed environment and not accidentally (or deliberately) regress to the YOLO life?
My solution was to write a shell script, sb, that makes it easy to enter a sandbox. Running sb alone opens a shell without network permissions and with access to only the current directory subtree.
I can also run specific commands within a sandbox: sb online -- yarn install will install packages (gotta be online for that) while preventing any of those packages’ build scripts from accessing anything beyond the current directory subtree.
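Under the hood, sandbox-exec takes profiles written in a small Scheme-flavored policy language. A minimal sketch of an offline profile (illustrative only, not sb’s actual policy; the filename is made up):

```
;; offline.sb: allow everything except the network
(version 1)
(allow default)
(deny network*)
```

Running sandbox-exec -f offline.sb zsh then drops you into a shell whose processes can’t reach the network.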
I also updated my shell prompt to show cute emoji reminders of the current shell’s capabilities: 🏠 means “can access home directory” and 📡 means “can access network”.
Check the sandbox script out for yourself, read the source, and let me know if you have any suggestions.
For defense in depth, I also run Little Snitch, which monitors all network connections and blocks them by default; it’s worth checking out if you want to be a bit more secure in your computing.
“[A] motorised monolithic 3D-printed plastic flexure stage with sub-100 nm resolution that can perform automated optical fibre alignment.”
If that’s not impressive enough, how about a 3D printer that makes you a sandwich?
“What happened? A timeline on the development and approval of Moderna, Pfizer, and AstraZeneca’s vaccines.”
“TyBot is a completely autonomous rebar tying robot that ties up to 1,000 intersections an hour.”
Microbially-produced xanthan gum is not only an acceptable food thickener but also one of the most promising agents for enhanced oil recovery in the petroleum industry.
“Stanford… has reverse-engineered and sequenced mRNA from these vaccines… by asking a pharmacist for used vials with a couple drops left… and they’ve posted them to GitHub… as a .docx.pdf.”
“A volume of 7 km^3 of olivine on 2% of the world’s tropical shelf seas could remove total yearly anthropogenic CO₂ emissions.”
“Why airlifting rhinos upside down is critical to conservation.”
A fun, extremely-knowledgeable conversation about market making.
A massive container ship got stuck in the Suez Canal.
“One of the failure modes of a mercury arc rectifier is that the two anodes basically have a motherfucking wizard battle and throw lightning at each other.”
“To produce this unique golden cloth, 70 people spent four years collecting golden orb spiders from telephone poles in Madagascar”
“Research is stopped on a system of space propulsion which broke all the rules of the political game.”
Apparently the first major SpaceX innovation was petitioning the GAO to keep NASA from spending all its rocket money on a sole-source, non-competed contract back in 2004.
A multitouch window management concept from 2009.
Speaking of ultra high aspect ratio machining, people are doing wire electro-discharge machining at home now.
Yeah, PCR is cool, but have you seen LAMP, a cheaper, isothermal DNA amplification technique that can provide visual results within 30 minutes from raw samples? I first heard about this from a video overview of COVID testing techniques and have since watched this 90 second animation about 20 times. Also, this Italian guy developed CoroNaspresso, a LAMP-based COVID test in a Nespresso pod!?!?
Two Paths to the Future proposes that, given an 80% chance of artificial general intelligence emerging before 2100, humanity’s best chance against it is to clone a million John von Neumanns. Yeah this feels absurd, but I don’t see anything obviously wrong with the argument itself.
Why videogame conversations are so terrible and how to improve them.
Shout out to polished, singularly-focused websites for doing exactly one thing, like generating toolpaths for laser-cut boxes.
“[P]eople with limited understanding of business think that business is all about making profits. But those who actually run businesses know that running a business is all about managing cash flows.”
Enzyme automatically calculates derivatives from LLVM IR (and so any program you can write in C, Rust, Zig, etc.). It feels like this will come in super-handy sometime, though I’m not sure for what yet.