Create a Chromebox with Peppermint Linux

Over the past fifteen years, I have needed to upgrade my computers less and less. In the late ’90s through early 2000s, every couple of years my motherboard/CPU/memory were so horribly out of date that the latest software updates almost begged me to upgrade. However, I built my last PC over five years ago (ASUS board, quad-core AMD processor @ 2.4GHz, 4GB memory), and it continues to steam along through every OS I have installed. In fact, even single- or dual-core PCs that date back to 2006 still have plenty of life in them. The reason is simple: computers have steadily become fast enough for the basic applications we use daily — email, web browsing, and office applications.

Of course, back in the mid-2000s, there was no such prevalence of the cloud, much less any type of browser-based applications that lived within it. Applications and their data were stored on the PC, and thus required sufficient local horsepower and storage. Jump ahead to 2014 — it’s a very different story. Dropbox or Google Drive synchronizes your data into the cloud, and Google’s suite of office applications is good enough for most day-to-day activities. With the web browser becoming the most-used application, the requirements on a PC that is already fast enough are minimal: you can conceivably get by in life with only a web browser. In fact, the bulkiest “application” that slows down the machine could be considered the operating system itself! (Looking at you, Microsoft.)

This is the premise of Chromebooks and, more recently, Chromeboxes: design hardware and the OS to support a web browser in which the user does everything, and replace the standard desktop or laptop for a fraction of the cost. Companies like Google, Samsung, and ASUS are starting to sell these systems based on Chrome OS, a Linux-based variant. However, if you already have an older PC lying around, why shell out more money when you can repurpose it as a veritable Chromebox?

Peppermint is a Linux distribution that was developed under the same premise as Chrome OS, and is freely available. I discovered Peppermint a year ago when switching to Linux Mint as my distribution of choice. Mint is based on Ubuntu (itself stemming from Debian Linux), and in my opinion creates a more familiar and user-friendly desktop look and feel than Ubuntu. Peppermint, like Mint, produces a similar, familiar user experience, but aims to minimize its own footprint so it runs speedily on low-memory, lower-performance (older) systems. While Peppermint bundles some basic applications (including Dropbox), you could argue its primary app is Chromium, the open-source web browser project behind Google Chrome. More applications can be installed, but the baseline configuration is perfect for targeting a system that uses the cloud for productivity (e.g. Google’s office applications). In short, Peppermint has become my favorite go-to operating system, especially when breathing new life back into six- to eight-year-old hardware.

My first AngularJS web app, part 3

In my last installment, I rounded out the architecture of the Minecraft Free News web app I created for my son. Now let’s talk about the server-side implementation.

I host through AWS and serve via an EC2 instance with a typical LAMP stack installation. However, for this particular project, MySQL and PHP were not required. Apache is used to serve the web app itself — the HTML, CSS, and JavaScript — to the client. Thus, when browsing to the site on port 80, Apache is doing all the work. Easy enough.

After the client web browser receives the HTML/CSS/JavaScript from Apache, it executes the JavaScript which pulls the enabled RSS feed items from the various Minecraft-related websites. However, the JavaScript I wrote also connects to my server via HTTP on a different port to request a special JSON object. This object contains custom HTML information that can be inserted as the top headline (above the “Minecraft Free News” logo), and/or on any line in any column of the content area. Thus, to make day-to-day content modifications to the web app, I need not edit my JavaScript code served by Apache, but instead modify the JSON object being returned by Node. Note that this JSON object is very small: only about 2 KB.
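The post doesn’t show the object itself, so here is a hedged guess at its shape as a JavaScript literal; the field names (`headline`, `inserts`) and the sample HTML are purely illustrative, not the app’s actual schema.

```javascript
// Hedged sketch of the custom-content JSON object. The post only says it
// carries headline HTML plus per-line, per-column inserts; everything
// below is an illustrative guess at how that might be structured.
var customContent = {
  // Rendered above the "Minecraft Free News" logo.
  headline: '<a href="http://example.com/sale">Minecraft book sale!</a>',
  // Each entry targets one line within one column of the content area.
  inserts: [
    { column: 2, line: 5, html: '<em>Sponsored: example ad</em>' }
  ]
};
```

Because the object stays this small (a couple of kilobytes at most), editing it is a trivial way to change the page’s content day to day.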

A Node HTTP server is exceedingly simple to configure in just a few lines, and there are countless tutorials available explaining the details. My version is modeled quite closely on the example authored in The Node Beginner Book, which explains nicely and in some detail the reasoning behind its easily extensible design. So in my case, when the client’s browser executes the JavaScript, it connects to my server on the port on which Node is listening, and receives the JSON object. The data from that JSON object is then incorporated into the content and rendered by Angular. And voilà, the web app is complete.

My first AngularJS web app, part 2

Picking up from my previous installment, I created my web app layout and styling using Bootstrap, heavily customized so it doesn’t look like the standard Bootstrap-built site. Now it’s time to start hanging the meat on the bones by integrating AngularJS and JavaScript. Note that in this and subsequent blog posts I will discuss the high-level design of the app rather than reviewing individual code snippets. Frankly, assuming you can code, I believe it’s more important to understand the design reasoning and notable lessons than to wade through code that merely demonstrates syntax.

As you can see on Minecraft Free News, there are essentially two major areas on the page: the feed selection at the very top of the page, and the content (individual feed items) displayed on the rest of the page. After creating the AngularJS module, I created two AngularJS controllers: FeedController to manage the feed source selections at the top, and ContentController to manage populating the feed items into the content area. I created an Angular service that simply provides a data object containing an array of all feed sources, including each feed’s title, URL, an enabled/disabled flag, and the maximum number of feed items to pull. Via this service, the data object is shared between the FeedController and ContentController. When the user clicks on a feed source to toggle its enable/disable attribute, the specific feed in the data object is changed by the FeedController, and that information is provided to the ContentController, which immediately acts upon it to show or hide that feed’s items in the content area. In other words, this service serves as a communication conduit between my two controllers.
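The sharing works because an Angular factory is a singleton: both controllers are handed the same object. Here is a hedged sketch of that idea; the names (`feedDataFactory`, `toggle`) and the sample feed record are my own, not necessarily the app’s.

```javascript
// Sketch of a shared-state service: both controllers receive the same
// feeds array, so a toggle made by one is immediately visible to the other.
function feedDataFactory() {
  var feeds = [
    { title: 'Example Feed', url: 'http://example.com/rss',
      enabled: true, maxItems: 5 }
  ];
  return {
    feeds: feeds,
    toggle: function (feed) { feed.enabled = !feed.enabled; }
  };
}

// In Angular this would be registered on the app module, roughly:
//   angular.module('newsApp').factory('FeedData', feedDataFactory);
// and injected into both FeedController and ContentController.
```

Because Angular’s data binding watches the shared object, ContentController needs no explicit notification: the view updates as soon as FeedController flips the flag.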

I created another Angular service called PullFeeds to interface with Google’s Feed API to pull each individual feed’s content items. This service relies upon promises, a facility bundled with Angular to support asynchronous interactions. As our app is sourcing RSS data from other websites, we are at the mercy of network and server delays. Per the Google Feed API, we can make a call to pull the feed items from a specific feed URL, and we must supply a callback function that will process the items received from that feed. Without using a promise, when the PullFeeds service is executed, it will return immediately without any feed items, because the action via Google’s Feed API is asynchronous and thus not yet completed. However, when using a promise in the service, the promise is fulfilled, and the data returned from the service, only once our callback function has received it — exactly as desired.
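The pattern of wrapping a callback-style API in a promise can be sketched as follows. This is a hedged illustration, not the app’s code: the real service uses Angular’s $q and the Google Feed API, so here a fake loader (`fakeFeedLoader`) stands in to keep the example self-contained.

```javascript
// Simulated callback-style feed loader standing in for the Google Feed API.
function fakeFeedLoader(url, callback) {
  setTimeout(function () {
    callback({ entries: [{ title: 'Example item from ' + url }] });
  }, 0);
}

// Wrap the callback API in a promise: callers of pullFeed() get a promise
// that is fulfilled with the feed entries once the callback fires.
function pullFeed(url) {
  return new Promise(function (resolve, reject) {
    fakeFeedLoader(url, function (result) {
      if (result && result.entries) {
        resolve(result.entries);                    // success: hand back items
      } else {
        reject(new Error('No entries from ' + url)); // failure: reject
      }
    });
  });
}
```

With $q the structure is the same: the service returns the promise immediately, and the controller attaches a handler that runs only once the entries actually arrive.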

I also use promises elsewhere in the app (via another custom service) to pull JSON data from my own server in order to insert custom ads and news items inline in the content. Beyond returning data, promises can also return a rejection that indicates the deferred action failed (i.e. a failed promise to deliver). In my case, if the promise to read the data from my server fails (e.g. the server is down or the network is inaccessible), our web app can handle it gracefully by receiving the rejected promise and displaying the feed items without ads.
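That fallback path can be sketched like this. The function names are mine, and the fetch is simulated as an immediate rejection to show the failure case; the real app would reject when the HTTP request to the Node server fails.

```javascript
// Simulate the failure case: the custom-content server is unreachable,
// so the promise rejects instead of resolving with the JSON object.
function fetchCustomContent() {
  return Promise.reject(new Error('server unreachable'));
}

// Stand-in for rendering; in the real app Angular does this.
function renderContent(feedItems, customContent) {
  return customContent ? 'feed items with ads' : 'feed items without ads';
}

var page = fetchCustomContent()
  .then(function (custom) { return renderContent(['item'], custom); })
  // Graceful degradation: on rejection, render without the custom inserts.
  .catch(function (err) { return renderContent(['item'], null); });
```

The key point is that the rejection is just another resolved state to plan for: the catch handler turns a server outage into a slightly plainer page rather than a broken one.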

As a final note for this installment, I want to point out that all of the JavaScript execution described thus far occurs on the client side. The HTML, CSS, and JavaScript are read by the client from my server, and using AngularJS the resulting page is rendered entirely by the client. All of the feed items are pulled by the client. My server is only used to serve the relatively small HTML/CSS/JavaScript, and to supply a small JSON object for custom ad and news items. Though seen as a dynamic web app from the client side, from the server’s perspective it’s all static content, and thus very little bandwidth is required.