Haha yeah. Fancy name for a spike with a pressure gauge on top. You can do the same test with a piece of rebar sharpened on one end. When it gets hard to push into the soil, that's about 150 psi, the limit of what plant roots can push through.
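For a rough sense of the numbers, here's a back-of-envelope sketch in Python, assuming a 1/2-inch rebar tip (the diameter is my assumption, not from the original test):

```python
import math

# Back-of-envelope: what push force corresponds to ~150 psi on a 1/2" rod?
# (1/2 inch is an assumption; use your rod's actual diameter.)
diameter_in = 0.5
area_sq_in = math.pi * (diameter_in / 2) ** 2   # cross-section of the rod tip
force_lbf = 150 * area_sq_in                    # pressure (psi) * area (in^2)
print(f"tip area ~{area_sq_in:.2f} in^2, force ~{force_lbf:.0f} lbf")
# -> about 0.20 in^2 and about 29 lbf of steady push
```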
That one may have been a bit of an outlier, but I've been seeing 12-16" before hitting 150 psi when testing with the soil penetrometer in that bed.
No dig for the last few years, and applied some high-quality compost last year. I think the good biology in the compost is building up the soil food web.
https://void.cat/d/6zJseJek9a6dAmmgsBZYUv.webp
It might be helpful, or at least interesting, to know how a Stable Diffusion checkpoint was trained. I grabbed this script from Reddit and packaged it up for easier use. It compares two checkpoints and shows the tokens with the largest difference between them. These are likely the words used most frequently during fine-tuning.
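A minimal sketch of what a script like that does, assuming SD 1.x .ckpt files and the usual CLIP token-embedding key (the file names below are placeholders, not the actual script):

```python
# Sketch: compare the CLIP token-embedding tables of two SD 1.x checkpoints
# and print the tokens that moved the most between them.
import torch
from transformers import CLIPTokenizer

EMB_KEY = "cond_stage_model.transformer.text_model.embeddings.token_embedding.weight"

def token_embeddings(path):
    sd = torch.load(path, map_location="cpu")
    sd = sd.get("state_dict", sd)  # some checkpoints nest weights under "state_dict"
    return sd[EMB_KEY].float()

base = token_embeddings("v1-5-pruned-emaonly.ckpt")  # base model (placeholder name)
tuned = token_embeddings("photon_v1.ckpt")           # fine-tuned model (placeholder name)

# Per-token L2 distance between the two tables; the biggest movers are the
# tokens most likely emphasized during fine-tuning.
diff = (tuned - base).norm(dim=1)

tok = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
top = torch.topk(diff, k=20)
for score, idx in zip(top.values.tolist(), top.indices.tolist()):
    print(f"{tok.convert_ids_to_tokens(idx):>15}  {score:.4f}")
```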
The image was generated with 11 of the top tokens from the Photon checkpoint and a random prompt from the One Button Prompt extension. Three of the Photon tokens were real words, "fancy steel mask", and the rest were nonsense words or word fragments: zzle elis eha fol hep spon abia wbo
The artists selected by One Button Prompt are important to the style, but the image doesn't really resemble either of them.
I generated the txt2img image at 768x768, adjusted levels and curves in GIMP, and brought it back into img2img for upscaling with ControlNet Tile and Ultimate SD Upscale.
txt2img prompt:
zzle elis eha fol hep spon abia fancy steel mask wbo, art by Myoung Ho Lee, (art by Jacek Yerka:0.7) , landscape of a Maximalist Costa Rica, roots with Herb garden, at Midday, Ultrarealistic, Nostalgic lighting,
Steps: 30, Sampler: DPM++ SDE Karras, CFG scale: 3, Seed: 2631022029, Size: 768x768, Model hash: ec41bd2a82, Model: photon_v1, Lora hashes: "add_detail: 7c6bad76eb54, epiNoiseoffset_v2: d1131f7207d6", Version: v1.5.1
Easily pulled this dandelion out of my garden bed. I think my soil is improving.

#soilfoodweb #permies #permaculture #gardenstr
GM. Libraries give value for your taxes; government doesn't.
Cherry tomatoes roasted in the oven are delicious, and they can be preserved in oil.
compost, please
https://void.cat/d/9k158MMUn2Z5F3zhb5BYto.webp
I didn't intend to delete the post of the beautiful garlic, oops. To make up for it, here's a beautiful red dragonfly I met in the garden today. It was so busy chowing down on something it caught that it didn't mind me sticking a camera in its face.
is it good now? last time it murdered my phone
i am become nostr

saviour of worlds
Why not fight both at the same time? Building up living soil will sequester huge amounts of carbon and grow more nutritious food, leading to better health.
Mixed reality has a lot more short term potential than VR.
Stable Diffusion for images. Automatic1111 is the most popular UI; Vladmandic is another version. Olivio Sarikas and Sebastian Kamph on YouTube are good resources.
I experimented with a project called dalai for ChatGPT-style text generation but haven't followed that as closely.
Peas of various maturity and garlic scapes, fresh from the garden at the same time, went straight into a homemade pea soup. Cooked the most mature, starchiest but still plump peas the longest, in a chicken broth base from "Better than Bouillon". While that cooked, cut the garlic scapes into 1-inch lengths and the most tender green pea pods into inch or smaller pieces. Added seasoning, in my case garlic and onion powder, oregano, and a bay leaf. Added the scapes and tender pea pods to the pot, cooked till tender, then immersion blended it all together (not the bay leaf). Added the sweet green peas last, left them whole and unblended, and cooked till tender. Pepper and salt to taste, and mixed in some olive oil at the end. I think the secret here is that these are "sugar snap" peas. I'm on my third bowl. Turned out better than I expected.
#foodstr #grownostr #permaculture #permies
https://nostrcheck.me/media/public/nostrcheck.me_5213774681815818921688405409.webp
https://nostrcheck.me/media/public/nostrcheck.me_7565888995909139461688405409.webp
Very nice, all my scapes have already been eaten but I'm drowning in snow peas now. Immersion blenders are great for soup. You can blend some of it and leave the rest for some variety too.
I've been excited about the Lynx R-1 mixed reality headset since I first heard about it, and it just keeps getting better. Open source, and the hardware is as open as they can make it. Much lower latency than others, more brightness, raw access to sensors and frames.
They didn't design it to help people with low vision, but check this talk out. Even without specific software, this thing is giving people their vision back. Think what it could do with more software features to aid vision.
https://youtu.be/rWzVX11hF_8?t=102
#grownostr #mixedreality #ar #vr #xr
Sorry, embed didn't use the timestamp. Skip ahead to ~1:30 for the talk.
