Regardless of how patchwork a fabric our control systems may represent, the thread that binds them is an Ethernet connection, either directly or through some nature of hub, gateway, router, bridge or similar technology. Unlike contemporary home automation systems, our controllers do not communicate through a Cloud (jargon for a server at somebody else's address over which you have no control and which you cannot directly protect), and a handset is not what we consider a controller.
So where are our controllers? They may seem to be everywhere, but most of the control happens in an equipment rack located in a storage closet in our bonus room, above the 3-car garage. We create “Pi stacks” with as many as 16 Raspberry Pi 3 Model B boards in each stack. The photo at the left shows one such stack.
Tiny hinges in the front and rare-earth magnets on the rear bracket let us rotate the stack upward for access. The stack you see in the picture is mechanically complete but not yet fully integrated. There are two 16-Pi stacks and one 5-Pi stack on this rack shelf; the white stack at the back holds the (wired) Ethernet switches that connect these stacks to the Controller intranet. In our Ethernet color coding, cables on the Controller intranet are white; the black cables connect the Raspberry Pi micro-USB power ports to 5 VDC from three regulated supplies (named Huey, Dewey and Louie) on the next shelf up. We added a digital voltmeter to each of these so we could keep an eye on their output. Of the 37 Raspberry Pi boards on this shelf, 32 connect to the Controller intranet; the other 5 connect to the red-cable Drawbridge intranet, which is allowed limited WAN access for special functions, like retrieving relevant forecast data (by API) from the major weather services.
That's on the front of the rack; we built it 40 inches deep so the back is also a useful facility. The dual-WAN router is up top; the blank ports are lightning and surge protectors for the PoE (Power over Ethernet) lines that connect to the IP cameras on the black-cable Surveillance intranet. A few of the switches and some power distribution units are also visible here.
We decided to build rather than buy the NVR (network video recorder) at the heart of the surveillance system; Blue Iris software affords it more features and more flexibility than most commercial NVR systems at a dramatically lower cost. This is a rack-mounted Windows PC and is the only device other than the firewall that connects to both the camera network and the controller network. A Silverstone chassis designed to fit in a 4U rack space houses it and allows our MegaRAID controller to create a very trustworthy RAID 6 array with as many as 8 drives. PNY provided a GeForce 960 graphics card to drive a big screen on the other side of the cabinet, and Accele provided a small flat-panel HDMI monitor that we attached to the front of the case.
The rack also houses an always-online sinewave UPS, several PDUs (power distribution units), the firewall, Gigabit PoE switches for the surveillance camera connections, a Gigabit switch for the wired Ethernet controller connections, a rack drawer for the NVR keyboard and pointer, DC power supplies and several other controllers and interfaces. It also houses the MySQL server and backup drive arrays for it and for the NVR.
In addition to the units in the rack, and those in the CAP kits, other Raspberry Pi 3 Model B controllers live where they have work to do: at the apron of the driveway, in the garage, at the front door, near the irrigation controller, at the cooktop and so on.
An enclosed Sanus rack, on wheels, lives inside one of the ersatz lamppost pillars near the apron of the driveway. It is connected by optical fiber from the rack in the bonus room to an outdoor-qualified Microsemi PoE (Power over Ethernet) switch (shown at the right of the top paragraph). In addition to duties related to activities at the apron of the driveway, this rack also connects to sensors mounted inside the mailbox.
But our discussion of where controllers are located goes beyond their physical location.
Raspberry Pi hither and nigh
We know that some information will need to be centrally stored and that we will need several ways of weaving together the various controller operations, including (please look them up; explaining each here would be tedious): JSON, MQ, MySQL, NNTP, ONVIF, Bluetooth Beacon, XML, SMTP, HTTP requests (wget and curl), active HTML and Web page serving.
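As one small illustration of the JSON piece of that weave, here is a minimal sketch of how a controller might package a status report as JSON before handing it to a message queue or an HTTP request. The field names and the "apron-pi" controller name are our own illustrative inventions, not the actual schema used on this system.

```python
import json
from datetime import datetime, timezone

def encode_status(controller, subsystem, state, extra=None):
    """Serialize a controller status report as a JSON string.
    Field names here are illustrative, not an actual site schema."""
    msg = {
        "controller": controller,
        "subsystem": subsystem,
        "state": state,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if extra:
        msg.update(extra)
    return json.dumps(msg)

def decode_status(payload):
    """Parse a JSON status report back into a dict."""
    return json.loads(payload)

# Example: a driveway-apron controller reporting a gate event
wire = encode_status("apron-pi", "gate", "open", {"vehicle_detected": True})
report = decode_status(wire)
print(report["controller"], report["state"])  # apron-pi open
```

The same payload could travel over MQ, be POSTed with curl, or be logged to MySQL; JSON is the common envelope that lets those paths interoperate.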
Controllers have to be named (if only for logins to happen). Here is the list we started, but in reorganizing for a cybernetic architecture, a more complete list will be impossible to finalize until we are further into the programming tasks: when a controller starts getting too many jobs to do, it makes more sense to add another controller than to overtax any of them.
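Since the roster will keep growing as jobs get split off onto new boards, a consistent naming convention helps. The sketch below shows one hypothetical scheme (intranet, location, two-digit index); it is our own invention for illustration, not the convention actually used here.

```python
import re

# Hypothetical convention: <intranet>-<location>-<nn>,
# e.g. "ctrl-garage-01" or "drawbridge-mailbox-02".
# Kept to lowercase letters, hyphens and digits so the
# names are safe as login hostnames.
HOSTNAME_RE = re.compile(r"^[a-z]+-[a-z]+-\d{2}$")

def make_name(intranet, location, index):
    """Build and validate a controller name under the scheme above."""
    name = f"{intranet}-{location}-{index:02d}"
    if not HOSTNAME_RE.match(name):
        raise ValueError(f"invalid controller name: {name}")
    return name

print(make_name("ctrl", "cooktop", 1))  # ctrl-cooktop-01
```

With a scheme like this, adding a second cooktop controller is just `make_name("ctrl", "cooktop", 2)`; no existing names need to change.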
Supporting the congregation
With most of the upstream (non-endpoint) controllers congregated in one place, we have the ability to add collective protections, including battery backup, surge protection, hot spares, an on-hand workstation for troubleshooting and more.
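Hot spares are only useful if you notice a controller has gone quiet. A simple way to do that, sketched below under our own assumptions (a 120-second threshold and invented controller names), is to track each controller's last heartbeat and flag any that fall silent, so a spare can be assigned its duties.

```python
import time

STALE_AFTER = 120  # seconds without a heartbeat before we flag a controller

def stale_controllers(heartbeats, now=None):
    """Given {name: last_heartbeat_epoch_seconds}, return the names whose
    last report is older than STALE_AFTER seconds."""
    now = time.time() if now is None else now
    return sorted(n for n, t in heartbeats.items() if now - t > STALE_AFTER)

# Simulated check at t=1000: apron-pi last reported 300 seconds ago
beats = {"garage-pi": 990, "frontdoor-pi": 950, "apron-pi": 700}
print(stale_controllers(beats, now=1000))  # ['apron-pi']
```

In practice the heartbeat table would be fed by the controllers themselves (for example, each one posting a periodic status message), and the on-hand workstation could run a check like this on a timer.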
© Copyright 2016, 2017 and 2018 Newstips, Lord Martin Winston and J2J Corporation; all rights reserved