Skills

On the wall…

There are five rifles on the wall. Four lever action and “Mrs. Pink”, an AR-15 platform with pink furniture. Don’t ask.

They are known as “Bear”, “Deer”, “Raccoon”, “Squirrel”, and “Mrs. Pink.”

Bear is a Henry Big Boy in 45-70. Deer is a Winchester model 94 in 30-30. Squirrel is a Henry Golden Boy in .22LR.

We do have bear around here, and I know that Bear has enough stopping power, with rapid follow-ups.

Deer has taken a couple of deer. She does a fine job with iron sights for me out to around 150 yards.

Squirrel isn’t used for squirrel hunting, but damn he’s fun to shoot.

That leaves Raccoon. Raccoon is a Rossi R-95 in .357 Magnum. She eats .38 Special just fine. She is a little loose where the stock attaches to the receiver, but she will put rounds on target out to 100 yards with no problem.

The lever action in .357 is a nice, mid-weight rifle. I’ve used it for taking fat raccoons and opossums. One shot and they are down.

She is easy to reload for, and it is easy to police up all the brass. I cast hollow point bullets for her and have some commercial bullets for her as well.

All in all, she is a great rifle.

There is a matching wheel gun in .357 magnum. I don’t have enough time with that revolver. It is more than capable of putting rounds on target, I’m not. It doesn’t shoot like my Sig nor my 1911s.

Would I recommend an R-95 for a first-time gun buyer? No.

They don’t have a great reputation. The loading gate is nasty sharp, and it needs a little care to get it to function easily. I found that finding ammo for it was a bit of a pain. With reloading, it is a joy.

Mrs. Pink has a red dot on her. She belongs to my wife. We run the manual of arms every so often, but I figure she has 30 rounds before she needs an assist to load the next magazine. But I know that those 30 rounds are going exactly where she wants them to go.

The iron sights on the four lever guns work fine for me today. I have another 30-30 that has a scope mounted on it. I need to spend a few dollars to replace the scope with something modern and then sight everything in.

All in all, those rifles make up the “go to” set when needed now.

The other part of this is the LBVs that are available for use. Each vest has six 30-round mags of 5.56, at least two spare mags for the pistol that goes with the LBV, and a first aid kit.

Past Plans

When I was considering buying my first firearms, I was looking at “what happens if…” My thought process was based on the concept of availability of ammo after the fall.

That led me to an AR-15 in 5.56, an AK type rifle in 7.62×39, a 9mm Glock, a bolt action in 7.62×51, a black powder revolver, and a black powder rifle.

The firearms I have the most fun with, to this day, are the ARs. They are gentle on the shoulder, the ammo isn’t too expensive, they are easy to carry, and they are just plain fun.

Though I will note that they eat ammo rapidly. It isn’t an unusual range day when I send 300+ rounds down range.

I still have .308 from the original ammo buy. I’ve augmented it with reloads, but I don’t feed much through that rifle.

Of course, once I started buying firearms, it hasn’t really stopped.

Regardless, as more than one person has said, when the SHTF, the best firearm is the one you have.

Prepping – Leggy Seedlings

See those seedlings hardening off in the header image? Those are from a garden I was growing about a decade ago. They’re strong, healthy seedlings. They’re ready to be set into the ground to thrive and grow and make veggies for us.

And then there’s this specimen:

See how it’s falling over, and it only has a single set of leaves? Those aren’t even leaves, by the by. Those are called cotyledons, or “seed leaves.” They’re just there to get the plant going. By the time a seedling is as tall as this one is, it should have several sets of leaves. So why is this poor thing falling over and not growing better and stronger?

The first thing it’s lacking is probably light. Most of the time, when we’re starting seedlings indoors, we’re short on light. There are plenty of ways to fix that, of course. You can put them on a rack with a light right above them, and put a timer on it to give them 12 hours a day. That will fix the light problem, even if they’re cheap light strips. What if you don’t have a rack with light strips, though? Well, you can make do by giving your plants as much light as you can. I have a “daylight lamp” that I use in the winter to help with depression. This time of year, I no longer need it, so I give it to my plants. I move it around, so they all share in the glory of it. Any lamp put close enough to the seedlings will help.

Your seedlings also may be too cold, or alternatively, too warm. Most seeds like to germinate between 65 and 80°F, so if your home goes below that at night (or, like mine, never gets that high even in the daytime), you may need to pick up a seed mat. The mats aren’t that expensive, and you just place your seed trays right on top of them. They keep the temperature warm but not hot, and convince your seedlings that it’s time to grow.

A third option is that you are not watering your plants enough, or that you’re over-watering them. You can tell if your plants have enough water by feeling the soil they’re in. If it’s dry and flaky, you need to water them, stat! If it’s saturated and dripping, it’s probably too wet. You need moist soil that clumps when you take a handful of it, but that isn’t dripping and sopping wet. It’s my strong opinion that the best way to water seedlings is from below. The containers your seedlings are in should have several small holes in the bottom (and if there aren’t, add some), or be made out of porous material like paper egg cartons. The containers should be sitting in a waterproof container, either one designed for the purpose or whatever you have on hand. Pour the water into the bottom of the tray, and let the soil suck it up from below. This encourages strong roots, which is important for your plant. If there’s a tiny bit of water in the tray, you’re fine. If it’s an inch deep, you need to drain it out. I also keep a mist sprayer on hand full of water, and each day I will spritz my seedlings. This helps prepare them for the rigors of a rainfall when they get outside.

The last option for helping seedlings develop strong stems and avoid them being leggy is to blow a fan over them. This should be a VERY gentle fan, aimed above but not directly onto the seedlings. This simulates the breeze outside, which is part of what causes a plant to create thick stems and vigorous root systems. The fan, sweeping back and forth, will make your plants signal themselves to create more roots and stronger stems. Another method is to (GENTLY) brush over your seedlings with your hands each morning and evening. This need take only a couple of seconds, and should be done very carefully. You don’t want to break or damage the plants.

If you get to the point where you need to thin out seedlings (an unfortunate thing but necessary), don’t pull them up. Pulling plants disturbs the soil and surrounding plants, possibly causing more than intended to die. Instead, cut them off at the soil line. The plant will die off and feed the soil, and you can feed the thinned plants to your chickens or bunnies (so long as they aren’t poisonous).


How Many Languages Do You Speak?

In computer languages, there are very few that are structurally different.

FORTRAN is like COBOL, which is like Pascal, which is like BASIC, which is like ADA, which is like …

Forth is not like those above. Nor is APL or Lisp.

Assembly languages can be used in structured ways, just like FORTRAN, COBOL, Pascal, and many others. It requires the discipline to write patterns like “if not condition, jump to skip_label; do the stuff inside the condition; skip_label:”. The actual programming logic stays the same.

The two computer languages I dislike the most are PHP and Python, both because of how loosely they treat types.

In a statically typed language, you declare a variable’s type before you use it. The type of the variable is immutable for its lifetime.

In other words, if you declare a variable as being of type integer and then attempt to assign a string to it, it will barf on you during compilation.

In PHP, all variables look the same: any variable can hold any type at any moment. The type can change from line to line. And the language will do implicit type casting. It is hateful.

Python has all the same characteristics I hate in PHP, with the added hateful feature of using indentation instead of begin/end markers for blocks.

I’m lucky that Python has an optional typing capability, which I use consistently. The optional part is a pain when I want to use a module that has no typing information. When that happens, I need to create my own typing stub.
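A minimal sketch of what that optional typing looks like (the function is a hypothetical example, not from any real module; the annotations are enforced by an external checker such as mypy, not by Python itself):

```python
# Python's optional type annotations: ignored at runtime, but a
# checker such as mypy flags mismatches before the code ever runs.

def word_count(chapter: str) -> int:
    """Count the whitespace-separated words in a chapter."""
    return len(chapter.split())

total: int = word_count("It was a dark and stormy night")
print(total)  # 7

# A type checker would reject the next line; the interpreter would
# happily run it and let something break downstream:
# total = "seven"
```

When a third-party module ships no annotations, the checker sees every value from it as untyped, which is where the hand-written typing stubs come in.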

But the worst part of all of this is that they get jumbled together in my head. How many entries are in an array? In PHP, that is determined by the count() function; in Python, it is the len() function.

In Python, the dot (.) is used to access methods and attributes of objects. In PHP, it is the concatenation symbol.
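Side by side, the two colliding habits look like this (a sketch; the PHP equivalents appear only as comments and are illustrative, not from the original post):

```python
# Counting entries: Python uses len(); PHP uses count($arr).
chapters = ["one", "two", "three"]
n = len(chapters)                  # PHP: $n = count($chapters);

# The dot: concatenation in PHP, attribute access in Python.
title = "Chapter" + " " + "One"    # PHP: $title = "Chapter" . " " . "One";

class Book:
    def __init__(self) -> None:
        self.pages = 300

pages = Book().pages               # Python's dot reaches into the object
```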

I am tired of writing Python in my PHP files and I dread switching back to Python because I know my fingers will mess things up.

The Weekly Feast – American Goulash

Sometimes known as American Chop Suey (no idea why), this dish has been served in American homes since the mid-1800s. It’s usually a macaroni-based ground beef dish. This week, Chris and I made orecchiette pasta, and we enjoyed it in my American Goulash. This is my own recipe, and I recommend it highly!

Ingredients:

  • 16 oz elbow macaroni or fresh pasta
  • olive oil as needed for cooking
  • 1 large onion, diced
  • 2 cloves garlic, minced
  • 1 lb ground beef
  • 1 medium carrot, finely diced
  • 1 stalk celery, finely diced
  • dash of red wine
  • 2 tbsp all purpose flour
  • 28 oz (2 cans) diced tomatoes, any flavor
  • 1/4 to 1/2 cup tomato juice or V8
  • 1 tsp brown sugar
  • 1 tsp dried oregano
  • 1 tsp dried basil
  • 1 tbsp salt
  • ½ tsp black pepper
  • 1 tsp Worcestershire Sauce (optional)

Fill a large pot with water, add a dash of salt, and bring it to a boil. Cook the pasta according to package directions. If you’re using fresh, cook your pasta until it’s al dente, which can take anywhere from 3 to 7 minutes, depending on the thickness and overall size of your pasta. Drain the pasta, and set it aside.

In a heavy pot, add a bit of oil to the bottom and brown the ground beef. When the meat is thoroughly cooked and no pink remains, add the onions, carrots, and celery, and continue to cook until the onions soften and become translucent. Stir often, to make certain the mixture doesn’t stick to the bottom of the pot. If necessary, add a bit of olive oil or butter to the pot. Add in the garlic and cook for one more minute.

Drizzle in some of the red wine and deglaze the bottom of the pot. Make sure nothing is sticking to the bottom, and add more wine as necessary, but not enough to make it very wet. Sprinkle a tablespoon or two of flour over the ground beef mixture, and stir gently to incorporate it. The result should be a slightly sticky, somewhat gummy mass in the bottom of your pot.

Add in the tomatoes, sugar, Worcestershire sauce, and spices, and cook until it begins to thicken. Add in as much tomato juice or V8 as necessary to make the consistency similar to a thin gravy. Simmer for 2 minutes or so, until all the food is evenly heated. Pour in the cooked pasta, mix it in well, and add salt and pepper to taste. Let this simmer on a very low heat (or in your oven at 250°F) for about 15 to 20 minutes, checking often to be sure it isn’t sticking. If it’s too thick or dry, you can add a bit more tomato juice.

Serve this up with a bit of crusty bread or a side salad for a delicious and hearty meal.

Prepping – Keeping out the Neighbors

You can take that title however you like. When I first wrote it, I was thinking of the four legged kind of beastie that sneaks in and eats your broccoli while you’re sleeping. However, if we’re talking prepping, there’s a legitimate chance that the critter in your garden is two legged and armed. So let’s unpack that!

Regular Critters

The most common form of problem in your garden is likely to be pests. These include, but are not limited to, ants, roaches, moths, hornworms, tent caterpillars, aphids, and bunches of other multi-legged beasties, as well as mice, voles, moles, possums, raccoons, deer, porcupines, and other wild and domestic animals. A cat that digs up your kitchen garden in order to use it as an outdoor toilet is just as destructive as the raccoon that takes out whole plants.

Poison is one method of getting rid of pests. It’s not a method I recommend, only because I know that poison can be transferred from its intended victim (the mouse or raccoon) to unintended victims such as owls (who keep the pests down naturally and should be cared for and preserved) and local cats and dogs (who sometimes do eat pests outdoors). There’s also a possibility that vegetables covered in poison might be transferred to deer that we harvest later for our own eating pleasure, and that would be a Very Bad Thing, indeed. When it comes to mice in winter, I occasionally lift this personal ban, only because I dislike mouse poop in my kitchen more than I dislike the thought of accidentally killing an owl.

Traps are another method, and while they do work, they’re a LOT of work. You can dead-trap or live-trap, but regardless, you have to deal with what’s in the trap on a daily basis. Depending on what you’ve caught, it can be problematic. Consider the person who accidentally captures a skunk in a “have a heart” trap, and then has to figure out what to do with the stinky critter. When it comes to live traps, again, I really don’t recommend it. When you unload your traps, your victims have the ability to just wander home and do more destruction.

Read More

The Weekly Feast – A Springtime Feast in 1750

I’m at the Fort this weekend (if you’re in the area, come on over and visit!), presenting life in the early spring in a cold environment. I’ll be staying all weekend, with no running water (it’s turned off until all danger of frost is gone) and little electricity (the gift shop has some). I decided that the food I was going to make should reflect the environment I’ll be in, and so these meals are ones that conceivably could have been served at the Fort in the spring of 1750.

Soup Meagre

I’ve adapted this from Hannah Glasse’s recipe of 1765. I find it amusing how closely it resembles the Green Soup that I made a couple of weekends ago for a Viking reenactment I did. There’s never much food in the spring, and what you can get your hands on has to “make do” until you can plant and harvest crops. It’s a tough time of year! This is a very plain soup, but with the seasonings, would probably have been quite the treat. Early greens in New England would include ramps, asparagus, watercress, fiddleheads, dandelion greens, and things we consider weeds like stinging nettle, onion grass, and dock.

Ingredients:

  • 2 tbsp butter
  • 1 onion, finely chopped
  • 2 celery stalks, cut into 1- to 2-inch lengths (“half as long as your finger”)
  • 6-8 oz mixed greens, (spinach, lettuce, arugula, etc), chopped if large
  • 3 tbsp parsley, finely chopped
  • 1 tbsp flour
  • 2 to 4 cups broth
  • 1/2 tsp salt, plus more to taste
  • 1/2 tsp black pepper, plus more to taste
  • 1/4 tsp ground mace and/or nutmeg

Melt the butter in a large kettle or Dutch oven over medium heat. When the bubbling has subsided, add the onions and cook for about five minutes, until transparent.

Add the celery, greens, and parsley, stir, and cook over medium heat for about 10 minutes. Sprinkle the flour over the greens and stir to blend. Add the broth, salt, pepper, and mace, and stir well. Simmer the soup over medium-low heat for about 30 minutes.

Taste and add more salt and pepper, if desired. Serve warm, with bread if you have it.

Notes from Mistress Allyson: If you want to add a bit of protein to this meal (something that would have been in high demand in the 1750s in spring), try some beans or a bit of salt pork. Beans get added right before simmering. Salt pork should go in with the butter at the beginning.

Read More


Remote Repositories for GIT

Definitions

When working with git, there are several areas we might be discussing. The most important areas are “working directory”, “staging”, “local repository”, and “remote repository”.

A repository without a working directory is called a “bare repository”.

The working directory is a location where your files live, with a reference back to the repository. It is possible to extract the files without keeping a reference to the repository.

Most working directories, and what we will assume for our discussion, have the bare repository within the .git directory (folder).

A remote repository is normally a bare repository located somewhere apart from your local working directory. This could be a different directory on the local machine, or it could be located on a remote, network connected system.

Creating a Local Version of a Remote Repository

This is what the remote repository looks like. A pretty simple version.

gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797"

We don’t copy it, we clone it. The implication is that what we have in the local version is the same as what is in the remote version.

git clone ssh://git@github.com/author/book.git

This creates a directory named “book”; it clones/copies the bare repository from GitHub and places it in “book/.git”. It creates a “remote” reference within “book/.git/refs/remotes” named “origin”. Under “origin” it creates a copy of all the branches that are in the remote repository, in our example just “main”.

The clone command then checks out the working directory into “book”. This would be the files “chapOne.md”, “chapTwo.md”, and “chapThree.md”. It creates a file in “book/.git/refs/heads” named “main” with the commit hash (identifier) of “0ccd79797”.

This is a copy of refs/remotes/origin/main.

Our diagram now looks something like:

%%{init: { 'gitGraph': {'rotateTag': true}}}%%
gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797" tag: "origin/main"

At this point, you start work on your three files over two commits, your local repository now looks like:

%%{init: { 'gitGraph': {'rotateTag': true}}}%%
gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797" tag: "origin/main"
   commit id: "7d918ddc0"
   commit id: "f4b0d4086"

Meanwhile, your editor has fixed some typos and grammar errors and pushed those to the remote repository.

gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797"
   commit id: "cd6023b24"
   commit id: "ac6d2dd15"

These two look the same, but notice that the last two commits have different values/hashes. This is because they are different.

Since you are done with your edit, you attempt to send your changes back to the remote repository named “origin” via a push command: git push origin main. This fails because there would be two versions of the repository if you did this, and there can be only one.

To correct this, you first fetch an updated copy of the repo.

gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797"
   branch origin/main
   checkout origin/main
   commit id: "cd6023b24"
   commit id: "ac6d2dd15"
   checkout main
   commit id: "7d918ddc0"
   commit id: "f4b0d4086"

This is done with git fetch origin. This command updates only the bare repo.

We now need to combine these two branches. This is done with git merge origin/main, after which our local repository looks like this:

gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797"
   branch origin/main
   checkout origin/main
   commit id: "cd6023b24"
   commit id: "ac6d2dd15"
   checkout main
   commit id: "7d918ddc0"
   commit id: "f4b0d4086"
   merge origin/main id:"119c29222"

We then add some more to the story line and make one more commit.

gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797"
   branch origin/main
   checkout origin/main
   commit id: "cd6023b24"
   commit id: "ac6d2dd15" tag: "origin/main"
   checkout main
   commit id: "7d918ddc0"
   commit id: "f4b0d4086"
   merge origin/main id: "119c29222"
   commit id: "040e63c01"

We do another fetch; there is nothing to do, as nothing else has been added. We then push our commits back to the remote repository: git push origin main.

gitGraph
   commit id: "d6458df1e"
   commit id: "0ccd79797"
   commit id: "cd6023b24"
   commit id: "ac6d2dd15"
   commit id: "7d918ddc0"
   commit id: "f4b0d4086"
   commit id: "119c29222"
   commit id: "040e63c01"

Because I’m not the program, there might be some small ordering issues in the final commit.
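The whole round trip above (clone, diverge, rejected push, fetch, merge, successful push) can be rehearsed with a throwaway local “remote”. This is a sketch, not the book repo itself: the bare directory stands in for GitHub, the file names are the chapter files from the example, and `git init -b` needs git 2.28 or newer.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# The "remote": a bare repository whose default branch is main.
git init -q --bare -b main origin.git

# The editor clones it, writes the first draft, and pushes.
git clone -q origin.git editor 2>/dev/null
cd editor
git symbolic-ref HEAD refs/heads/main     # name the unborn branch main
git config user.email ed@example.com && git config user.name Editor
echo "Chaptr One" > chapOne.md
git add chapOne.md && git commit -qm "first draft"
git push -q origin main
cd ..

# You clone the same remote.
git clone -q origin.git author

# The editor fixes the typo and pushes; the remote is now ahead of you.
cd editor
echo "Chapter One" > chapOne.md
git commit -qam "fix typo" && git push -q origin main
cd ../author

# Meanwhile you commit new work locally.
git config user.email au@example.com && git config user.name Author
echo "Chapter Two" > chapTwo.md
git add chapTwo.md && git commit -qm "add chapter two"

# The push is rejected: the two histories have diverged.
git push -q origin main 2>/dev/null || echo "push rejected"

# Fetch updates only refs/remotes/origin/main; merge joins the work.
git fetch -q origin
git merge -q -m "merge editor fixes" origin/main
git push -q origin main && echo "push accepted"
```

Nothing here needs a network: a “remote” is just a repository somewhere else, exactly as the definitions section said.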

The point in all of this is that the magic happens behind the scenes. The program can do most merges with no assistance from you. In the rare cases where there is a merge conflict, it is relatively easy to manually merge the changes.

A merge conflict happens when two commits modify the same line of code. In your version, you had “Ciliorys hat” originally. You modified it to be “Billy-Bobs hat”. Your editor had changed it to “Cilory’s hat”.

Now you have two edits to the same line. Git says, “You figure it out.” and shows you the two versions of the line, in context. You can pick one version or the other, or put in an entirely different version.

You choose the third option and put “Billy-Bob’s hat”. The world is good.
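That conflict scenario can be played out in a scratch repository (a sketch with the same hypothetical hat line; `git init -b` assumes git 2.28+):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b main .
git config user.email you@example.com && git config user.name "You"
echo "Ciliorys hat" > chap1.md
git add chap1.md && git commit -qm "original"

# Your editor's version of the line, on a branch.
git checkout -qb editor
echo "Cilory's hat" > chap1.md
git commit -qam "fix apostrophe"

# Your own version of the same line, back on main.
git checkout -q main
echo "Billy-Bobs hat" > chap1.md
git commit -qam "rename character"

# Both sides touched the same line, so git stops and hands it to you.
git merge editor >/dev/null 2>&1 || echo "merge conflict"
grep -q '<<<<<<<' chap1.md && echo "conflict markers in file"

# Pick the third option, then conclude the merge yourself.
echo "Billy-Bob's hat" > chap1.md
git add chap1.md
git commit -qm "settle on Billy-Bob's hat"
```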

Conclusion

git is powerful. This discussion barely touches on the power of git.

There is an entire process of modifying code by “forking” a repository. When you are finished with your modifications, you can contribute them back to the original repository with a “Pull Request”.

Git has multiple methods of inserting code review and other tools into the process.

It is so powerful it can be used to create a full wiki, on the fly. The raw files are served as wiki pages.

There is a method of doing a binary subdivision to find bugs that were introduced in the past. There is a method of tracking who introduced an errant line of code.

There are tools for pulling a commit out of the middle of a branch and applying it to a different branch, without taking the rest of the modifications.

In general, there are only about a dozen commands that a user needs to know to work with git.

If you would like to work with git, there are communities ready to help you, and there are multiple cloud providers that will allow you to host your repo on the web.


Source Code Control for Beginners

Update

My introduction to source code control came at University. The name of the program was “update”. It took an “update deck” which described lines to remove, by line number, and lines of code to insert.

This format allowed us to inspect the code that was actually being changed, as well as the surrounding code. Every line of code I wrote for the Systems Group that was installed went through three levels of code review and QA testing before going live in the system.

Having those change decks helped in the review process. As a side note, the author’s initials were attached as a note to the right of every line of code we modified. Easy stuff.

After a change deck was accepted, it became part of the “installed version” of the software.

One of the powerful features of working with change decks is that two (or more) people could be working on the same piece of code and unless their changes overlapped, they could be applied independently.

RCS

When I left University, I started working with the BRL CAD project. This introduced me to the RCS system.

RCS was something like “update” but not quite. And you didn’t think in terms of “change decks”. That was handled behind the scenes.

You had a directory (folder) in which you had your code. You also had hidden files that stored the RCS history of the code.

By default, files were stored read-only. You could read them, you could compile from them, but you could not modify them.

To modify a file, you needed to first check out the file. When you checked out a file, it was “locked” to you and nobody else was allowed to modify the file.

You made the changes you wanted to the checked out files, then you tested. When you were happy that your code worked, you checked in the file you had checked out.

This is great when modifying a single file, but if you are modifying more than one file to accomplish your fix or enhancement, you have to check in each file in a separate operation.

There was no linkage between the files to indicate that all the changed files needed to be processed as a gestalt.

When you were ready to make a release, you had to do some magic to mark each file as being part of that particular tag. Then, at a later time, you could check out that entire tree and work on it as if it was the day of the release.

RCS did magic behind the scenes to figure out the “delta” between the checked out code and the original. This was equivalent to the “update deck” I was used to from University Days.

To work in a collaborative methodology, you would have a single “working directory” with everybody on the team having read/write privileges to the directory. If you were working across multiple machines, each machine had to use the same shared directory via a network file system. (NFS at the time)

At one point, I was working on BRL CAD on my home machine. I did not have enough space on the drive to copy the entire RCS tree to my local drive, so I was using NFS over a 28.8k dial-up modem.

Compile times ran about 3 days. And if anybody changed one of the “big” include files, I would have to start the build over again.

If you were working on a copy of the source code, you would extract a patch file from RCS to submit back to the master RCS directory.

It felt easy at the time, but it wasn’t as easy as it seemed. We just didn’t know what we didn’t know.

CVS

CVS was the first major paradigm change in source code control for us. The basic use was the same as with RCS, but they had changed the layout.

You now had an explicit directory, CVS, which contained the history files. When you checked out files, the lock was done in the CVS directory.

In addition, you could check out the files read-only (no lock) remotely from the CVS directories, then check out with a lock, edit on the remote system, and check in your changes.

This was a game changer. We no longer required a network file system.

Unfortunately, we had some of the same issues as we had with RCS. The main one being that only one person could check out/lock a file at a time. With team members working nearly 24 hours per day, it was a pain when the morning dude wasn’t available at 2237 to release a lock.

SVN

SVN solved most of the known problems with CVS. It had the concept of a remote repository, it allowed multiple people to work on the same file at the same time. It had better branch and tag capabilities.

All in all, it was a vast improvement.

The two primary weaknesses were no gestalt for files and very slow check out of branches and tags away from the main trunk.

I remember using SVN. I had to use it just a couple of weeks ago. I don’t think I ever fell in love with it. It was a step-wise improvement over CVS.

git

Git is my favorite source control system. I understand that there is another SCS, but I can’t recall its name at this point. I’ve not used it.

Git changed the paradigm we use for changing the repository. Whereas all the previously discussed SCSs work on a file-by-file basis, git works on a “commit” basis.

Even if you are working in a collaborative environment, you work on your personal repository (repo). We will get to collaborative environments shortly.

In the simplest form, you create a “working directory” which you populate with your code. That could be a book, a program, an application, or a web page. It doesn’t matter. Git doesn’t care what the files contain, only that they be text files.

Git can work with binary files, but that is not our focus.

Once you have your initial contents, you create your repo with git init. With this magic command, git creates all the required files to track the history of your project.

Let’s say you are working on a book. You have placed each chapter of the book in a separate file. One of your characters is named Cillary Hlinton. Your editor tells you that the name is just too close to a real person, and he would rather not be sued. He asks you to change the character’s name.

Under update, RCS, CVS and SVN, you would check out individual files, change the name to “Billy Boy” and then check in your changes. When you have made all the changes, you are happy.

The issue is that Chapter One is on revision 44, Chapter Two is on revision 37, and Chapter Three is on revision 48. How do you figure out the revision from just before you made the changes?

With git, you do not check out files and lock them. Instead, all files are ready for you to modify. You just edit the files and change the name.

Now you have chapters one, two, and three that have been modified. You group them into a single commit by adding them to the staging area: git add chap1.md chap2.md chap3.md

You can do this on one git add or multiples, in one session or multiple sessions. At some point you will be satisfied with your collection of changed files.

At that point, you commit the changes. You will be required to supply a message.
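The cycle looks like this in a scratch repository (a sketch using made-up chapter files; the character rename stands in for any multi-file change):

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email you@example.com && git config user.name "You"

# Two chapters, committed together as the "before" state.
printf 'Cillary went to town.\n' > chap1.md
printf 'Cillary came home.\n'    > chap2.md
git add chap1.md chap2.md
git commit -qm "before the name change"

# Edit both chapters, then stage and commit them as one unit.
printf 'Billy Boy went to town.\n' > chap1.md
printf 'Billy Boy came home.\n'    > chap2.md
git add chap1.md chap2.md
git commit -qm "rename the character"

# One command steps the whole book back to the earlier state.
git checkout -q HEAD~1 -- chap1.md chap2.md
grep -q Cillary chap1.md && echo "old name restored"
```

Because the two files travel in one commit, there is a single identifier for “the book just before the rename”, which is exactly what the per-file revision numbers above could not give you.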

Each of the following circles represents a commit.

Before Name change

After the name change

If we want to see the version before the name change, we can check out commit 4. When we do, all the files are changed back to the version before adding your name changes.

This makes it easy to find one particular point where the state of the book is one way and in the next commit, all the changes have taken place across the entire book.

The other major improvement that git brought was fast branches.

Branches

Here we see two branches added to the repository. The first, “HEAD”, is a special branch. It represents the commit associated with the working directory. It is manipulated implicitly instead of explicitly.

“master” was the default branch name until it was declared “rrracist”, so some repos now use “main” instead of “master”.

This ability to create branches rapidly allows us to make and destroy branches at will.

We are going to create a new branch, “editor”, for our editor to work on. Meanwhile, you are continuing work on chapter four.

Editor and Master branches

And here is where git shows another of its powers, the merge. With the ‘master’ branch checked out, we merge the editor branch, fixing all the little grammar and spelling errors: git checkout master; git merge editor

After Merge

With this merge completed, the master branch contains all the work done in the editor branch, but the editor branch does not have any of the new work done on master. To synchronize the editor branch with the master branch we do git checkout editor; git merge master.

After merging master into editor branches

If there is no more editing to be done, it is acceptable to delete the editor branch. No code will be lost.

Because the ability to branch and merge is so quick and powerful, it is normal procedure to start a new branch for each issue being addressed in a project. When the issue is resolved, the new code is merged into master or discarded.
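The whole branch-and-merge dance can be rehearsed in a throwaway repository (a sketch with hypothetical files; `git init -b` assumes git 2.28+):

```shell
set -e
cd "$(mktemp -d)"
git init -q -b master .
git config user.email you@example.com && git config user.name "You"
echo "a draft sentence" > chap1.md
git add chap1.md && git commit -qm "draft"

# The editor gets a branch; you keep writing chapter four on master.
git branch editor
echo "chapter four" > chap4.md
git add chap4.md && git commit -qm "start chapter four"

# Grammar fixes land on the editor branch.
git checkout -q editor
echo "a draft sentence, edited" > chap1.md
git commit -qam "fix grammar"

# Fold the edits into master, then sync editor back up with master.
git checkout -q master
git merge -q -m "merge editor fixes" editor
git checkout -q editor
git merge -q master            # fast-forward; editor now matches master

# With the editing done, the branch can go away. No code is lost.
git checkout -q master
git branch -qd editor
```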

Remote Repositories

Is a tale for another time.

Conclusion

If you can use a source code control system to track your work and changes, do so. It makes life so much easier in the long term.

Prepping – Prepping Outdoor Beds for Sowing

I’m watching the snow melt outside. It’s SLOWER than watching paint dry. Ah well. It’ll be gone soon, and then I can get to work on outdoor stuff. For now, it’s time to plan the outdoor garden space, and decide which things are getting direct sowed. In other words, which things go right into the ground (or raised bed/outdoor container/plant tower/etc) versus those that get started indoors because they’re too delicate for the cooler weather?

The first seeds that I’ll be direct sowing will be radishes, beets, carrots, peas, and spinach. These are all hardy crops, and they like the cold and damp that come along with early spring and late fall. They’re also staples around here. Well, not the beets so much. I like them, but most of the rest of the family doesn’t. That’s fine; more for me.

You’ll note that the beets and carrots and radishes are all what we call “root crops.” This means the edible part is under the ground. Generally speaking, for early spring crops you want to look for ones that say, “Plant seed outdoors as soon as the soil can be worked.” This means that a late frost in the spring won’t destroy your plants, and that’s a very good thing when you live in the northern part of America, or any part of Canada. Most root crops can be planted early, but always check the seed packets (or online if you don’t have the packets).

Before you can sow seeds directly into the soil outdoors (regardless of whether it’s in the ground, in a raised bed, or in a container of some kind), you have to prepare the garden bed. This takes several stages, and is best started as soon as you can get into your garden area. I can’t yet, because we still have snow deep enough to cause issues and I’m not shoveling out the garden. You can speed this up by covering your garden beds with black plastic each fall right before the snow flies. This keeps down on weeds, and also allows the beds to warm up earlier. Once your beds are defrosted and workable, you can begin planting. This is one of the main joys of any kind of raised bed.

Read More

Common raven (Corvus corax) eating dead chicken. Wild life animal.

Eating Crow

Or “You don’t know what you don’t know.”

The short of this is that I’ve been building PCs for years. They are LEGO blocks. You make sure the parts will fit together, and it all just works.

As an example, I “knew” that LGA sockets were for Intel CPUs. Last night I learned that LGA just means the motherboard socket has the pins. PGA means the CPU holds the pins.

How did I learn this? I was researching AMD CPU sockets and learned that the AM4 socket was of the PGA style, while the AM5 socket is of the LGA type.

I didn’t know what I didn’t know.

We run a local data center. It is still a work in progress. We have enough disk space, but not enough redundancy. We have some compute servers, but not enough.

We try to do some upgrade every month, trying to improve things. The last improvement was another node in the Ceph Cluster.

After spending weeks researching, I found a 4 bay NAS enclosure that took Mini-ITX motherboards. This felt just about perfect.

It uses a flex-style power supply, sized for the actual load of 4 HDDs and a motherboard; I went with 350 watts. Thus, it draws less power than older machines.

Finding a Mini-ITX board was another research hell. What I wanted was a motherboard with 4 SATA 3.0 ports, 1 or more SFP+ ports, one gigabit Ethernet port, at least 16 GB of memory, and NVMe support for 512 GB of storage.

I couldn’t find one. I haven’t given up, but I haven’t found one yet.

After searching, I found a Mini-ITX MB with an LGA 1155 socket, 4 SATA 2.0 ports, a 10/100 Ethernet port, 2 DDR3 slots (16 GB), and a PCIe slot.

This might seem low end, but it meets our needs. Spinning HDDs can’t even saturate SATA 2.0’s 3 Gb/s; we would only need SATA 3.0’s 6 Gb/s if we were using SSDs.

The 10/100 is useless for moving data, but meets our needs for a management port. All in all, a good choice.

When all the parts arrived, I couldn’t get the MB installed. The fan was too tall. I got a better cooler that was a low profile style. When that came in, I installed the board. It was painfully tight getting everything in. Took me over an hour to get all the cables hooked up just right.

Everything went well until I went to put the cover back on. At that point, I found the cover didn’t fit “because the case had the motherboard too close to the edge.”

I fixed that in the machine shop. Grinders and cut off wheels to the rescue.

Everything goes together.

After everything is configured and running, I slap a drive into the case and it works. Wonderful. Final step? Install the SFP+ network card.

It doesn’t line up. The damn thing doesn’t line up with the slot in the back.

After mulling it over for way too long, I made the cut-out in the back wider and moved the standoffs. Machine shop to the rescue.

Except I had a bad network card. Easily fixed via a replacement. No big deal.

After over a month of fighting this thing, making massive changes to the case, and taking it entirely apart to get the motherboard in, the machine is now in production.

Yesterday the motherboard for an upgrade arrived. The case I bought to hold it had the PCI slot moved over. This looks like it will all just work.

Except that when I go to install the MB, I can’t get it to fit into the case. No big deal, I’ll take this case apart too.

But the board doesn’t line up. It doesn’t line up with the standoffs. It doesn’t line up with the back slot. It doesn’t even line up with the onboard I/O baffle.

At that point, I measured my Mini-ITX board. It should be 170mm x 170mm. This board is not; it is 0.8 inches too wide. It isn’t a Micro-ATX, nor is it a Mini-ITX. It is some non-standard PoS.

I’m spitting mad at this point. I’ll put everything back in boxes until the new MB arrives. When it does arrive, I’ll be able to retire an older box that has been holding this data center back.

Everything now fits.

It wasn’t the case that was the issue with the last build. It was the motherboard. Time to update the reviews I wrote.