The next project
Having finished the terrace, I'm on to the next childproofing/renovation project. Years ago, I replaced a rotting wood deck railing with adult-spaced steel cable. I spaced the cables at 6" so that adding another set of wires would bring the gap down to the prescribed 3".
While steel cable opens the view, I like the look of redwood better.
$2k of redwood fence planks that took Blue Home Depot more than a month to assemble/deliver.
There's (already) a noticeable texture difference between the oiled and unoiled sections of the recently-completed terrace structure. Since the deck railing gets blasted by the sun all day, I'm doing everything I can to keep it from deteriorating.
I dunked the boards in linseed oil using a crude bath. The crudeness (4 mil painter's plastic) made itself evident quickly as the assembly formed a slow leak. It got the job done, but I probably could have used less oil and stain. Speaking of:
- 200 8' redwood fence planks
- 12 gallons boiled linseed oil
- 2 gallons Sedona Red oil-based interior stain

Will the linseed oil keep termites out?
I'm not sure, but a pincher bug that happened upon one of the oiled planks perished in minutes, so I'm hoping more invasive wood stowaways will meet the same fate. And yeah, as with all SoCal lumber, the termites are free with purchase. Anecdotally, I've seen termites in 4x6s survive rolled-on Copper Green (discovered by cutting the wood later), but these planks are considerably thinner.
The assembly line:
- Vigorously brush and leaf-blow each board
- Soak in the oil/stain bath for a few minutes (because 200 boards)
- Prop against the deck to drain excess
- Move the last-propped board to a secondary drying location
- Goto 1
With all the boards in one place, I think my next step will be to fire up the paint sprayer and hit them with water-based sealer left over from the last project.
Since the deck supports will soon have planks on their sunny side, I threw on some joist tape.
What is a recession? While some maintain that two consecutive quarters of falling real GDP constitute a recession, that is neither the official definition nor the way economists evaluate the state of the business cycle. Instead, both official determinations of recessions and economists' assessment of economic activity are based on a holistic look at the data...
The top stories of the week should have been dominated by earnings and interest-rate announcements; instead it was the White House doublethinking the word 'recession'. If I had to guess, when people think 'recession' they think breadlines, and that's not good for midterms.
Out of the frying pan: printing money to fulfill campaign promises like "I'm going to make you all very rich" while sock-puppeting "we don't expect inflation" through the FOMC.
Into the fryer: "Recession? Depends on what your definition of 'is' is."
I'm (naively) a lot more bothered by inflation than recession. What's the harm in having our country's economic activity be about the same as last year's? Yeah, okay, per-capita GDP matters and population is always growing. GDP has to keep pace with inflation and national debt. Etc., etc. Maybe those things should be addressed without the crutch of economic growth?
Trying to deflate speculation and keep inflation in check, the Bank of Japan sharply raised inter-bank lending rates in late 1989. This sharp policy caused the bursting of the bubble, and the Japanese stock market crashed. Equity and asset prices fell, leaving overly-leveraged Japanese banks and insurance companies with books full of bad debt. The financial institutions were bailed out through capital infusions from the government, loans and cheap credit from the central bank, and the ability to postpone the recognition of losses, ultimately turning them into zombie banks.
Adventuring has been a bit light with travel but we're still stabbing drakes and throwing rocks at vermlings.
The Lingering Swamp (personal quest)
Apparently my quartermaster pulled one of the most difficult personal quests: complete two scenarios in the Lingering Swamp. I looked it up and here's what I found:
Main scenarios: 19, 32, 45, 49. Random/event scenarios: 68, 79.
19 can be unlocked via 3->8->14->19 or 4->6->8->14->19.
We'd completed 19 so we decided to pursue 45, 4->5->10->22->35->45. Temple of the Elements (22) unlocked scenarios Battlements A (evil) and B (good). I didn't realize evil was the only path to Rebel Swamp (45), but probably would have gone good anyway. I guess that's blocked.
Since 49 is after 45, the only path to retirement other than a lucky event/scenario draw is 3->9->11/12->16->24->32 or 3->8->14->7/13->20->16->24->32.
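Out of curiosity, those unlock chains can be checked mechanically. A quick sketch, treating the unlocks mentioned in this post as a directed graph; the edge list is only what's been written above, not the full campaign map, so I may be missing routes:

```python
# Scenario unlocks mentioned in this post, as a directed graph.
edges = {
    3: [8, 9], 4: [5, 6], 5: [10], 6: [8], 8: [14],
    9: [11, 12], 10: [22], 11: [16], 12: [16],
    14: [7, 13, 19], 7: [20], 13: [20],
    16: [24], 20: [16], 22: [35], 24: [32], 35: [45],
}

def paths(start, goal, trail=()):
    """Yield every path from start to goal as a tuple of scenario numbers."""
    trail = trail + (start,)
    if start == goal:
        yield trail
        return
    for nxt in edges.get(start, ()):
        yield from paths(nxt, goal, trail)

# All routes to scenario 32 starting from scenario 3:
for p in sorted(paths(3, 32)):
    print("->".join(map(str, p)))
```

This spits out exactly the four 11/12 and 7/13 variants listed above.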
Since I sided with the merchants in the tabletop version, the Battlements scenario was new to me. It was fun to have archers on our side, even if it was to defend against endless waves of enemies...
... followed by the Prime Demon. It wasn't pretty, but we did it.
I finally got around to experimenting with concatenate layers. The only things that make them different from a normal sequential model:
- You specify a layer's input by calling the constructed layer on it, e.g. my_conv_layer = Conv2D(16)(my_dense_layer).
- You have to wire your model inputs/dimensions up differently.
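A minimal sketch of that wiring with tf.keras; the layer sizes, input shapes, and names here are made up for illustration, but the shape of the code is the point: layers are called on tensors, and the model takes a list of inputs.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Two separate image inputs; shapes are illustrative only.
bright_in = layers.Input(shape=(64, 64, 1), name="brightness")
color_in = layers.Input(shape=(64, 64, 2), name="hue_sat")

# Each branch is built by calling a constructed layer on a tensor.
b = layers.Conv2D(16, 3, padding="same", activation="relu")(bright_in)
c = layers.Conv2D(16, 3, padding="same", activation="relu")(color_in)

# Concatenate stacks feature maps along the channel axis: 16 + 16 -> 32.
merged = layers.Concatenate()([b, c])
out = layers.Conv2D(3, 3, padding="same", activation="sigmoid")(merged)

# fit()/predict() then expect a list of two arrays, one per input.
model = tf.keras.Model(inputs=[bright_in, color_in], outputs=out)
```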
For convolutional layers, concatenation simply stacks the output feature maps together along the channel axis (so they have to be the same height/width).
Left to right: brightness input, hue/saturation input, output (really bad), brightness input with hue/saturation set to output values (what I would expect the model to produce).
I experimented with trying to recolor images: hue/saturation as one input, brightness as the other. You know, so I could create HDR-like images without all that Photoshop or dragging the hue slider. More charitably:
- This'd be like a Hello World for concatenating input images.
- A good model might creatively apply recoloring.
My models failed miserably, of course. The hue/saturation branch of the model couldn't forget its geography and the entire thing seemed really bad at minimizing loss. I need to take another pass at it. Sometimes these things are total failures until the right thing clicks.
As suggested in the caption above, one way I cheated (extracted desired results from crappy output) was to simply take the hue/saturation from the output and apply it to the input brightness. Even if the model managed to spit out something recognizable, I might still want to use this technique to maintain the sharpness that is always lost when a network tries to reconstruct an image.
Dall-e drawing PUBG coffin dance, demonstrating the lack of sharpness in images produced by small models.
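The recombination itself is just channel swapping. A sketch, assuming both images have already been converted to (H, W, 3) HSV float arrays (the RGB<->HSV conversion, e.g. via skimage.color.rgb2hsv, isn't shown):

```python
import numpy as np

def take_color_keep_brightness(hsv_color_src, hsv_bright_src):
    """Hue/saturation from one HSV image, value (brightness) from another.

    Both arguments are (H, W, 3) float arrays in HSV channel order;
    conversion to/from RGB is assumed to happen elsewhere.
    """
    out = hsv_color_src.copy()
    out[..., 2] = hsv_bright_src[..., 2]  # replace only the value channel
    return out
```

Here the model output would be hsv_color_src and the original input image hsv_bright_src, so the crisp brightness survives even when the reconstruction is mushy.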
On the subject of sharpness, I stumbled on this discussion of losses for image generation:
MSE is not a good indication of quality in image enhancement.
Using MSE or a metric based on MSE is likely to result in training finding a deep learning based blur filter, as that is likely to have the lowest loss and the easiest solution to converge to minimising the loss.
A loss function that minimises MSE encourages finding pixel averages of plausible solutions that are typically overly smoothed and although minimising the loss, the generated images will have poor perceptual quality from a perspective of appealing to a human viewer.
Eventually I'm going to bite the bullet and figure out how to write a good custom loss function that penalizes blurry output.
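One starting point I might try: plain MSE plus a term comparing image gradients, so a blurred prediction (whose edges are flattened) pays extra even when its per-pixel error is small. This is a numpy sketch only; a real Keras loss would use tf ops, and the alpha weighting is a guess to be tuned:

```python
import numpy as np

def sharpness_aware_loss(y_true, y_pred, alpha=1.0):
    """MSE plus an L1 penalty on mismatched horizontal/vertical gradients."""
    mse = np.mean((y_true - y_pred) ** 2)
    # First differences approximate image gradients; a blur filter
    # flattens these, so the penalty targets exactly that failure mode.
    gx = np.mean(np.abs(np.diff(y_true, axis=1) - np.diff(y_pred, axis=1)))
    gy = np.mean(np.abs(np.diff(y_true, axis=0) - np.diff(y_pred, axis=0)))
    return mse + alpha * (gx + gy)
```

Against a sharp checkerboard target, a flat gray prediction scores much worse under this loss than under MSE alone, which is the behavior I'd want.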
Fun and games
The Exploration Society visited Harlan, home to a very tasty rice lager, and I got into a very pretty UAZ chase in the red zone. Both jeeps and all three players got nuked.