Tag Archives: technology

There’s a Fly in the Milk (and a Bug in the Software)

Where “software bugs” got their name — the dead moth stuck in a relay in Harvard’s Mark II in 1947. From https://en.wikipedia.org/wiki/Software_bug

As one does, I spent a good part of this weekend reading the Annual Report of the Michigan Dairymen’s Association. It provides an interesting glimpse into the processes that have to be managed to source raw materials from suppliers, to produce milk and cream and butter, and to cultivate an engaged and productive workforce.

You might be yelling at your screen right now. DairyMEN’s? Aren’t we beyond that now? What’s wrong with them? The answer is: nothing. This is an annual report from 1915. Your next question is probably what could the dairymen be doing in 1915 that would possibly be interesting for production and operations managers in 2019?  The answer here, surprisingly, is a lot. Except for the overly formal and old-timey word choices, the challenges and concerns encountered in the dairy industry are remarkably consistent over time.

It turns out that flies were a particular concern in 1915 — and they remain a huge threat to quality and safety in food and beverage production today:

  • “…an endless war should be waged against the fly.”
  • “[avoid] the undue exposure of the milk cooler to dust and flies.”
  • “The same cows that freshen in July and August will give more milk in December it seems to me… because at that time of year the dairyman has flies to contend with…”
  • “Flies are known to be great carriers of bacteria, and coming from these feeding places to the creamery may carry thousands of undesirable bacteria direct to the milk-cans or vats.”

In a December 2018 column in Food Safety Tech, Chelle Hartzer describes not one but three (!!!) different types of flies that can wreak havoc in a food production facility. There are house flies that deposit pathogens and contaminants on every surface they land on, moth flies that grow in the film inside drains until they start flying too, and fruit flies that can directly contaminate food. All flies need food, making your food or beverage processing facility a potential utopia for them.

In the controls she presented to manage fly-related hazards, I noticed parallels to controls for preventing and catching bugs in software:

  • Make sanitation a priority. Clean up messes, take out the trash on a daily basis, and clean the insides of trash bins. In software development, don’t leave your messes to other people — or your future self!  Bake time into your development schedule to refactor on a regular basis. And remember to maintain your test tools! If you’re doing test-driven development with old tools, even your trash bins may be harboring additional risks.
  • Swap outdoor lighting. In food production facilities, it’s important to use lighting that doesn’t bring the flies to you (particularly at night). Similarly, in software, examine your environment to make sure there are no “bug attractors” like lack of communication or effective version control, dependencies on buggy packages or third party tools, or lack of structured and systematic development processes.
  • Install automatic doors to limit the amount of time and space available for flies to get in to the facility. In software, this relates to limiting the complexity of your code and strategically implementing testing, e.g. test-driven development or an emphasis on hardening the most critical and/or frequently used parts of your system.
  • Inspect loading and unloading areas and seal cracks and crevices. Keep tight seals around critical areas. The “tight seals” in software development are the structured, systematic processes related to verifying and validating your code. This includes design reviews, pair programming, sign-offs, integration and regression testing, and user acceptance testing.
  • Clean drains on a regular basis. The message here is that flies can start their lives in any location that’s suitable for their growth, and you should look for those places and keep them sanitized too. In software, this suggests an ongoing examination of technical debt. Where are the drains that could harbor new issues? Find them, monitor them, and manage them.

Although clearly there’s a huge difference between pest management in food and beverage production and managing code quality, process-related pests have been an issue for at least a century — and likely throughout history. What are the flies in your industry, and what are you doing to make sure they don’t contaminate your systems and bring you down?

Quality 4.0: Let’s Get Digital

Want to find out what Quality 4.0 really is — and start realizing the benefits for your organization? If so, check out the October 2018 issue of ASQ’s Quality Progress, where my new article (“Let’s Get Digital”) does just that.

Quality 4.0 asks how we can leverage connected, intelligent, automated (C-I-A) technologies to increase efficiency, effectiveness, and satisfaction: “As connected, intelligent and automated systems are more widely adopted, we can once again expect a renaissance in quality tools and methods.” In addition, we’re working to bring this to the forefront of quality management and quality engineering practice at Intelex.

Quality 4.0 Evolution

The progression can be summarized through four themes. We’re in the “quality as discovery” stage today:

  • Quality as inspection: In the early days, quality assurance relied on inspecting bad quality out of items produced. Walter A. Shewhart’s methods for statistical process control (SPC) helped operators determine whether variation was due to random or special causes.
  • Quality as design: Next, more holistic methods emerged for designing quality into processes. The goal was to prevent quality problems before they occur. These movements were inspired by W. Edwards Deming’s push to cease dependence on inspection, and by Juran’s Quality by Design.
  • Quality as empowerment: By the 1990s, organizations adopting TQM and Six Sigma advocated a holistic approach to quality: quality is everyone’s responsibility, and empowered individuals contribute to continuous improvement.
  • Quality as discovery: Because of emerging technologies, we’re at a new frontier. In an adaptive, intelligent environment, quality depends on how:
    • quickly we can discover and aggregate new data sources,
    • effectively we can discover root causes, and
    • well we can discover new insights about ourselves, our products, and our organizations.

Read more at http://asq.org/quality-progress/2018/10/basic-quality/lets-get-digital.html, or download the PDF (http://asq.org/quality-progress/2018/10/basic-quality/lets-get-digital.pdf).

Perception of Value & Today’s Cryptocurrency “Crash”

Artist’s rendering of Bitcoin. THERE ARE NO ACTUAL COINS THAT LOOK LIKE THIS. Don’t ever let anyone sell you one.

Today, many cryptocurrencies lost ~35-50% of their value. Reddit even posted contact information for the National Suicide Prevention Hotline in /r/cryptocurrency, knowing how emotional investors were bound to be today. Bitcoin, which was nearly $20K in mid-December and had been hovering near $14K this past week, dropped nearly $4K and almost sank below the $10K milestone. I usually track the price of Bitcoin at http://bitcointicker.co, which can show the posted prices from several exchanges (web locations where people go to buy and sell, like eBay). There are hundreds of cryptocurrencies, and many of them dropped in value today.

Why did the prices drop so much on Tuesday? Much of the answer lies in how markets set prices, and in how people perceive value.

Market prices are usually driven by supply and demand — for example, if there aren’t many lobsters available in a particular area at a particular time, and you go to a restaurant hoping to order one, you’ll pay a premium. But price is also influenced by the quality of the product and by its image, both of which shape your perception of its value. Quality reflects how well something satisfies stated and implied needs or expectations.

Value, however, is quality relative to price, and influenced by image. And people are not always rational: they’ll pay a premium for image, even if the value of a product isn’t particularly high. Just think of all the Macs on display at schools, coffee shops, and airports. Price is related to value… usually, price goes up as value goes up.

Where’s the value of cryptocurrency? A Bitcoin does not, on its own, have any inherent value — just like a dollar or a Euro (a “fiat currency”). But an asset that people expect to rise in value — one you can buy low, hold (sometimes just for a few days), and sell high because there are lots of people willing to buy it from you — acquires perceived value. Hundreds of early adopters — the “Bitcoin millionaires” — are getting people excited about the prospect of making small investments and reaping huge rewards. That this has happened so recently lends a mystique, on top of the novelty, to owning cryptocurrencies and altcoins (“alternatives to Bitcoin,” like Ether).

Value is attributed to things by people, and cryptocurrencies are no exception. The quality of the currency itself and the technical solidity of the platform it’s based on aren’t really tied to the cryptocurrency price right now — although this will probably change as knowledge and awareness increase.

Is this the end of Bitcoin? That’s doubtful — there are too many innovators who insist on exploring the technological landscape of cryptocurrencies and blockchain technology, and lots of investors willing to fund them. In the meantime, there’s an upside you might not expect: because cryptocurrencies are not yet mainstream, a “crypto crash” is not as likely to ripple through the whole economy (no pun intended) the way the subprime mortgage crisis of 2008 did. But if you do decide to buy cryptocurrency, don’t invest any more than you can afford to lose.

The Value of Defining Context

Image Credit: Doug Buckley of http://hyperactive.to

The most important stage of problem-solving in organizations is often one of the earliest: getting everyone on the same page by defining the concepts, processes, and desired outcomes that are central to understanding the problem and formulating a solution. (“Everyone” can be the individuals on a project team, or the individuals that contribute actions to a process, or both.) Too often, we assume that the others around us see and experience the world the same way we do. In many cases, our assessments are not too far apart, which is how most people can get away with making this assumption on a regular basis.

In fact, some people experience things so differently that they don’t even “picture” anything in their minds. Can you believe it?

I first realized this divergence in the work context a few years ago, when a colleague and I were advising a project at a local social services office. We asked our students to document the process that was being used to process claims. There were nearly ten people who were part of this claims-processing activity, and our students interviewed all of them, discovering that each person had a remarkably different idea about the process that they were all engaged in! No wonder the claims processing time was nearly two months long.

We helped them all — literally — get onto the same page, and once they all had the same mental map of the process, time-in-system for each claim dropped to 10 days. (This led us to the quantum-esque conclusion that there is no process until it is observed.)

Today, I read about how mathematician Keith Devlin revolutionized the process of intelligence gathering after 9/11 using this same approach… by going back to one of the first principles he learned in his academic training:

So what had I done? Nothing really — from my perspective. My task was to find a way of analyzing how context influences data analysis and reasoning in highly complex domains involving military, political, and social contexts. I took the oh-so-obvious (to me) first step. I need to write down as precise a mathematical definition as possible of what a context is. It took me a couple of days…I can’t say I was totally satisfied with it…but it was the best I could do, and it did at least give me a firm base on which to start to develop some rudimentary mathematical ideas.

The fairly large group of really smart academics, defense contractors, and senior DoD personnel spent the entire hour of my allotted time discussing that one definition. The discussion brought out that all the different experts had a different conception of what a context is — a recipe for disaster.

What I had given them was, first, I asked the question “What is a context?” Since each person in the room besides me had a good working concept of context — different ones, as I just noted — they never thought to write down a formal definition. It was not part of what they did. And second, by presenting them with a formal definition, I gave them a common reference point from which they could compare and contrast their own notions. There we had the beginnings of disaster avoidance.

Getting people to very precisely understand the definitions, concepts, processes, and desired outcomes that are central to a problem might take some time and effort, but it is always extremely valuable.

When you face a situation like this in mathematics, you spend a lot of time going back to the basics. You ask questions like, “What do these words mean in this context?” and, “What obvious attempts have already been ruled out, and why?” More deeply, you’d ask, “Why are these particular open questions important?” and, “Where do they see this line of inquiry leading?”

(You can read the full article about Devlin, and more important lessons from mathematical thinking, here.)


Where is Quality Management Headed?

Image Credit: Doug Buckley of http://hyperactive.to

[This post is in response to ASQ’s February topic for the Influential Voices group, which asks: Where do you plan to take your career in 2016? What’s your view of careers in quality today—what challenges is this field facing? How can someone starting out in quality succeed?]

We are about to experience a paradigm shift in production, operations, and service: a shift that will have direct consequences on the principles and practice of design, development, and quality management. This “fourth industrial revolution” of cyber-physical systems will require more people in the workforce to understand quality principles associated with co-creation of value, and to develop novel business models. New technical skills will become critical for a greater segment of workers, including embedded software, artificial intelligence, data science, analytics, Big Data (and data quality), and even systems integration. 

Over the past 20 years, we moved many aspects of our work and our lives online. And in the next 20 years, the boundaries between the physical world and the online world will blur — to a point where the distinction may become unnecessary.

Here is a vignette to illustrate the kinds of changes we can anticipate. Imagine the next-generation FitBit, the personalized exercise assistant that keeps track of the number of steps you walk each day. As early as 2020, this device will not only automatically track your exercise patterns, but will also automatically integrate that information with your personal health records. Because one-size-fits-all diet strategies have recently been shown to be largely unfounded, and researchers like Kevin Hall, Eran Elinav, and Eran Segal now know that the only truly effective diets are the ones that are customized to your body’s nutritional preferences [1], your FitBit and your health records will be able to talk to your food manager application to design the perfect diet for you (given your targets and objectives).

Furthermore, to make it easy for you, your applications will also autonomously communicate with your refrigerator and pantry (to monitor how much food you have available), your local grocery store, and your calendar app, so that food deliveries show up when, and only when, you need to be restocked. You’re amazed that you’re spending less on food, less of it is going to waste, and you never have to wonder what you’re going to make for dinner. Your local grocery store is also greatly rewarded, not only for your loyalty, but because it can anticipate the demand from you and everyone else in your community – and create specials, promotions, and service strategies that are targeted to your needs (rather than just what the store guesses you need).

Although parts of this example may seem futuristic, the technologies are already in place. What is missing is our ability to link the technologies together using development processes that are effective and efficient – and in particular, our ability to coordinate and engage the people who will help make it happen. This is a job for quality managers and others who study production and operations management.

As the Internet of Things (IoT) and pervasive information become commonplace, the fundamental nature and character of how quality management principles are applied in practice will be forced to change. As Eric Schmidt, former Chairman of Google, explains:  “the new age of artificial intelligence is beginning, and it’s a big deal.” [2] Here are some ways that this shift will impact researchers and practitioners interested in quality:

  • Strategic deployment of IoT technologies will help us simultaneously improve our use of enterprise assets, reduce waste, promote sustainability, and coordinate people and machines to more effectively meet strategic goals and operational targets.
  • Smart materials, embedded in our production and service ecosystems, will change our views of objects from inert and passive to embedded and engaged. For example, MIT has developed a “smart band-aid” that communicates with a wound, provides visual indicators of the healing process, and delivers medication as needed. [3] Software developers will need to know how to make this communication seamless and reliable in a variety of operations contexts.
  • Our technologies will be able to proactively anticipate the Voice of the Customer, enabling us to meet not only their stated and implied needs, but also their emergent needs and hard-to-express desires. Similarly, will the nature of customer satisfaction change as IoT becomes more pervasive?
  • Cloud and IoT-driven analytics will make more information available for powerful decision-making (e.g. real-time weather analytics), but will come with their own set of challenges: how to find the data, how to assess data quality, and how to select and store data with likely future value to decision makers. This will be particularly challenging, since analytics has not been a historical focus among quality managers. [4]
  • Smart, demand-driven supply chains (and supply networks) will leverage Big Data, and engage in automated planning, automatic adjustment to changing conditions or supply chain disruptions like war or extreme weather events, and self-regulation.
  • Smart manufacturing systems will implement real time communication between people, machines, materials, factories and warehouses, supply chain partners, and logistics partners using cloud computing. Production systems will adapt to demand as well as environmental factors, like the availability of resources and components. Sustainability will be a required core capability of all organizations that produce goods.
  • Cognitive manufacturing will implement manufacturing and service systems capable of perception, judgment, and improving quality autonomously – without the delays associated with human decision-making or the detection of issues.
  • Cybersecurity will be recognized as a critical component of all of the above. For most (if not all) of these next generation products and production systems, quality will not be possible without addressing information security.
  • The nature of quality assurance will also change, since products will continue to learn (and not necessarily meet their own quality requirements) after purchase or acquisition, until the consumer has used them for a while. In a December 2015 article I wrote for Software Quality Professional, I ask “How long is the learning process for this technology, and have [product engineers] designed test cases to accommodate that process after the product has been released? The testing process cannot find closure until the end of the ‘burn-in’ period when systems have fully learned about their surroundings.” [5]
  • We will need new theories for software quality practice in an era where embedded artificial intelligence and technological panpsychism (autonomous objects with awareness, perception, and judgment) are the norm.

How do we design quality into a broad, adaptive, dynamically evolving ecosystem of people, materials, objects, and processes? This is the extraordinarily complex and multifaceted question that we, as a community of academics and practitioners, must together address.

Just starting out in quality? My advice is to get a technical degree (science, math, or engineering) which will provide you with a solid foundation for understanding the new modes of production that are on the horizon. Industrial engineering, operations research, industrial design, and mechanical engineering are great fits for someone who wants a career in quality, as are statistics, data science, manufacturing engineering, and telecommunications. Cybersecurity and intelligence will become increasingly more central to quality management, so these are also good directions to take. Or, consider applying for an interdisciplinary program like JMU’s Integrated Science and Technology where I teach. We’re developing a new 21-credit sector right now where you can study EVERYTHING in the list above! Also, certifications are a plus, but in addition to completing training programs be sure to get formally certified by a professional organization to make sure that your credentials are widely recognized (e.g. through ASQ and ATMAE).

 

References

[1] http://www.huffingtonpost.com/entry/no-one-size-fits-all-diet-plan_564d605de4b00b7997f94272
[2] https://www.washingtonpost.com/news/innovations/wp/2015/09/15/what-eric-schmidt-gets-right-and-wrong-about-the-future-of-artificial-intelligence/
[3] http://news.mit.edu/2015/stretchable-hydrogel-electronics-1207
[4] Evans, J. R. (2015). Modern Analytics and the Future of Quality and Performance Excellence. The Quality Management Journal, 22(4), 6.
[5] Radziwill, N. M., Benton, M. C., Boadu, K., & Perdomo, W. (2015). A Case-Based Look at Integrating Social Context into Software Quality. Software Quality Professional, December.

Free Speech in the Internet of Things (IoT)

IF YOUR TOASTER COULD TALK, IT WOULD HAVE THE RIGHT TO FREE SPEECH. Image Credit: from “Reclaim Democracy” at http://reclaimdemocracy.org/who-are-citizens-united/

By the end of 2016, Gartner estimates that over 6.4 BILLION “things” will be connected to one another in the nascent Internet of Things (IoT). As innovation yields new products, services, and capabilities that leverage this ecosystem, we will need new conceptual models to ensure quality and support continuous improvement in this environment.

I wasn’t thinking about quality or IoT this morning… but instead, was trying to understand why so many people on Twitter and Facebook are linking Justice Scalia’s recent death to Citizens United. (I’d heard of Citizens United, but quite frankly, thought it was a soccer team. Embarrassing, I know.) I was surprised to find out that instead, Citizens United is a conservative U.S. political organization best known for its role in the 2010 Supreme Court Case Citizens United v. FEC.

That case removed many restrictions on political spending. As a result, the super-rich now donate more than ever before to individual campaigns, and that spending power, combined with the “enormous” chasm in wealth, has given them the ability to steer the economic and political direction of the United States and undermine its democracy. Interesting, sure… but what’s more interesting to me is that the Citizens United case, according to this source…

  • Strengthened First Amendment protection for corporations, 
  • Affirmed that Money = Speech, and
  • Affirmed that Non-Persons have the right to free speech.

The article goes on to state that “if your underpants could talk, they would be protected by free speech.”

Not too long ago, a statement like this would just be silly. But today, with immersive IoT looming, this isn’t too far-fetched. 

  • What will the world look (and feel) like when everything you interact with has a “voice”?
  • How will the “Voice of the Customer” be heard when all of that customer’s stuff ALSO has a voice?
  • What IS the “Voice of the Customer” in a world like this?

Deploying Your Very Own Shiny Server

Nicole has been having a lot of fun the last few days creating her own Shiny apps. We work in the same space, and let’s just say her enthusiasm is very contagious. While she focused on deploying R-based web apps on ShinyApps.io, I’m more of a web development geek, so I put my energy towards setting up a server where she could host her apps. This should come in handy, since she blew through all of her free server time on ShinyApps after just a couple of days!

Before you begin, you can see a working example of this at https://shinyisat.net/sample-apps/sampdistclt/.

In this tutorial, I’m going to walk you through the process of:

  1. Setting up an Ubuntu 14.04 + NGINX server at DigitalOcean
  2. Installing and configuring R
  3. Installing and configuring Shiny and the open-source edition of Shiny Server
  4. Installing a free SSL certificate from Let’s Encrypt
  5. Securing the Shiny Server using the SSL cert and reverse proxy through NGINX
  6. Setting appropriate permissions on the files to be served
  7. Creating and launching the app Nicole created in her recent post

Setting Up an Ubuntu 14.04 Server at DigitalOcean

DigitalOcean is my new favorite web host. (Click this link to get a $10 credit when you sign up!) They specialize in high-performance, low-cost VPS (virtual private servers) targeted at developers. If you want full control over your server, you can’t beat their $5/month offering. They also provide excellent documentation. In order to set up your server, you should start by following these tutorials:

  1. How to Create Your First DigitalOcean Droplet Virtual Server
  2. How to Connect to Your Droplet with SSH
  3. Initial Server Setup with Ubuntu 14.04
  4. Additional Recommended Steps for New Ubuntu 14.04 Servers
  5. How To Protect SSH with Fail2Ban on Ubuntu 14.04

I followed these pretty much exactly without any difficulties. I did make a few changes to their procedure, which I’ll describe next.

Allowing HTTPS with UFW

I found that the instructions for setting up ufw needed a tweak. Since HTTPS traffic uses port 443 on the server, I thought that sudo ufw allow 443/tcp should take care of letting HTTPS traffic through the firewall. Unfortunately, it doesn’t. In addition, you should run the following:


$ sudo ufw allow https

$ sudo ufw enable

Your web server may not accept incoming HTTPS traffic if you do not do this. Note: you may not have noticed, but you also installed NGINX as part of the UFW tutorial.
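You can double-check that the firewall is behaving as expected with ufw’s status command (your rule list will look different depending on what else you’ve allowed, but you want to see entries permitting HTTPS/443 traffic):

$ sudo ufw status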

Setting up Automatic Updates on Your Server

The default install of Ubuntu at DigitalOcean comes with the automatic-updates package already installed. This means your server will get security patches and upgrades without you having to apply them manually. However, this package needs to be configured. First, edit /etc/apt/apt.conf.d/50unattended-upgrades to look like this:

Unattended-Upgrade::Allowed-Origins {
   "${distro_id}:${distro_codename}-security";
   "${distro_id}:${distro_codename}-updates";
};
Unattended-Upgrade::Mail "admin@mydomain.com";
Unattended-Upgrade::Remove-Unused-Dependencies "true";
Unattended-Upgrade::Automatic-Reboot "true";
Unattended-Upgrade::Automatic-Reboot-Time "02:00";

Note that this configuration will install upgrades and security updates, will automatically reboot your server at 2:00 AM if necessary, and will completely purge unused dependencies from your system. Some people don’t like to have that much happen automatically without supervision. Also, my /etc/apt/apt.conf.d/10periodic file looks like this:

APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Download-Upgradeable-Packages "1";
APT::Periodic::AutocleanInterval "7";
APT::Periodic::Unattended-Upgrade "1";

This sets package list updates, downloads, and upgrades to happen daily, and autocleaning to happen once a week.
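If you’d rather not wait overnight to find out whether the configuration works, the unattended-upgrades package includes a dry-run mode that simulates a run and prints what it would have done, without touching your system:

$ sudo unattended-upgrade --dry-run --debug

(Note the singular unattended-upgrade: that’s the name of the command, even though the package is called unattended-upgrades.)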

Installing and Configuring R

Okay, now that your server is set up (you should be able to view the default NGINX page at http://your-domain-name.com), it’s time to install R.

Set the CRAN Repository in Ubuntu’s sources.list

The first step is to add your favorite CRAN repository to Ubuntu’s sources list. This will ensure that you get the latest version of R when you install it. To open and edit the sources list, type the following:


$ sudo nano /etc/apt/sources.list

Move the cursor down to the bottom of this file using the arrow keys, and add the following line at the bottom:


deb https://cran.cnr.berkeley.edu/bin/linux/ubuntu trusty/

Of course, you can substitute your favorite CRAN repo here. I like Berkeley. Don’t miss that there is a space between “ubuntu” and “trusty”. Hit CTRL+X to exit from this file, and say “yes” when asked if you want to save your changes. The official docs on installing R packages on Ubuntu also recommend activating the backports repositories, but I found that this was already done on my DigitalOcean install.

Add the Public Key for the Ubuntu R Package

In order for Ubuntu to trust, download, and install the R packages from the CRAN repo, we need to install the repository’s public key. This can be done with the following command:


$ sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 51716619E084DAB9

Install R

Run the following:


$ sudo apt-get update

$ sudo apt-get install r-base

When this is finished, you should be able to type R --version and get back the following message:


$ R --version

R version 3.2.2 (2015-08-14) -- "Fire Safety"
Copyright (C) 2015 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under the terms of the
GNU General Public License versions 2 or 3.
For more information about these matters see
http://www.gnu.org/licenses/.

If you get this, you’ll know that R was successfully installed on your server. If not, you’ll need to do some troubleshooting.

Configure R to Use curl and Your CRAN Repository of Choice

Type the following to open up the Rprofile.site file:


$ sudo pico /etc/R/Rprofile.site

You may delete all of the content and add the following:


options(download.file.method="libcurl")

local({
    r <- getOption("repos")
    r["CRAN"] <- "https://cran.rstudio.com/"
    options(repos=r)
})

This will allow us to run install.packages('packagename') without specifying the repository later.
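To see this in action, try installing a package from the command line without naming a repository (ggplot2 here is just an arbitrary example; substitute any package you like):

$ sudo su - -c "R -e \"install.packages('ggplot2')\""

If the Rprofile.site changes took effect, R will pull the package from the RStudio CRAN mirror without complaining that no repository was set.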

Install Dependencies and Packages Needed by Shiny Server

We’re going to need the devtools package, which means we need to install the libraries upon which it depends first (libcurl and libxml2):


$ sudo apt-get -y build-dep libcurl4-gnutls-dev

$ sudo apt-get -y install libcurl4-gnutls-dev

$ sudo apt-get -y build-dep libxml2-dev

$ sudo apt-get -y install libxml2-dev

Now we can install devtools, rsconnect, and rmarkdown:


$ sudo su - -c "R -e \"install.packages('devtools')\""

$ sudo su - -c "R -e \"devtools::install_github('rstudio/rsconnect')\""

$ sudo su - -c "R -e \"install.packages('rmarkdown')\""

$ sudo su - -c "R -e \"install.packages('shiny')\""
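If you want to make sure those installs actually succeeded before continuing, you can ask R for each package’s version the same way (a version number printed back means the package is in place; an error means it isn’t):

$ sudo su - -c "R -e \"packageVersion('shiny')\""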

Install Shiny Server

Okay! Now we’re finally ready to install Shiny Server. Run the following:


$ cd ~ 
$ sudo apt-get install gdebi-core
$ wget https://download3.rstudio.org/ubuntu-12.04/x86_64/shiny-server-1.4.1.759-amd64.deb
$ sudo gdebi shiny-server-1.4.1.759-amd64.deb

At this point, your Shiny Server should be up and running, but we can’t visit it on the web yet because by default, it runs on port 3838, which is blocked by the firewall we set up earlier. We’re now going to secure it, and use a reverse proxy to run it through NGINX.
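Before we set that up, it’s worth confirming that Shiny Server is actually running and listening. From the server itself (this assumes curl is installed; if it isn’t, run sudo apt-get install curl first):

$ sudo service shiny-server status
$ curl -I http://localhost:3838

The curl command should come back with an HTTP response header rather than a connection error.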

Install an SSL Certificate with Let’s Encrypt

Let’s Encrypt is a new, free service that will allow you to install a trusted SSL certificate on your server. Since Google and Mozilla are working hard to phase out all non-HTTPS traffic on the web, it’s a good idea to get into the habit of installing SSL certs whenever you set up a new website. First install git, then use it to download letsencrypt:


$ sudo apt-get install git
$ git clone https://github.com/letsencrypt/letsencrypt
$ cd letsencrypt

Now before we install the certificate, we have to stop our web server (NGINX). In the code below, replace yourdomain.com with your actual domain name that you registered for this site.


$ sudo service nginx stop
$ sudo ./letsencrypt-auto certonly --standalone -d yourdomain.com -d www.yourdomain.com

If all goes well, it should have installed your new certificates in the /etc/letsencrypt/live/yourdomain.com folder.
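You can verify with a quick directory listing; Let’s Encrypt normally creates symlinks named cert.pem, chain.pem, fullchain.pem, and privkey.pem in that folder:

$ sudo ls /etc/letsencrypt/live/yourdomain.com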

Configure the Reverse Proxy on NGINX

Open up the following file for editing:


$ sudo nano /etc/nginx/nginx.conf

And add the following lines near the bottom of the main http block, just before the section labeled “Virtual Host Configs”. In my file, this started around line 62:


...

##
# Map proxy settings for RStudio
##
map $http_upgrade $connection_upgrade {
    default upgrade;
    '' close;
}

##
# Virtual Host Configs
##
...

And then open up the default site config file:


$ sudo nano /etc/nginx/sites-available/default

And replace its contents with the following. Note you should replace yourdomain.com with your actual domain name, and 123.123.123.123 with the actual IP address of your server.


server {
   listen 80 default_server;
   listen [::]:80 default_server ipv6only=on;
   server_name yourdomain.com www.yourdomain.com;
   return 301 https://$server_name$request_uri;
}
server {
   listen 443 ssl;
   server_name yourdomain.com www.yourdomain.com;
   ssl_certificate /etc/letsencrypt/live/yourdomain.com/fullchain.pem;
   ssl_certificate_key /etc/letsencrypt/live/yourdomain.com/privkey.pem;
   ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
   ssl_prefer_server_ciphers on;
   ssl_ciphers AES256+EECDH:AES256+EDH:!aNULL;

   location / {
       proxy_pass http://123.123.123.123:3838;
       proxy_redirect http://123.123.123.123:3838/ https://$host/;
       proxy_http_version 1.1;
       proxy_set_header Upgrade $http_upgrade;
       proxy_set_header Connection $connection_upgrade;
       proxy_read_timeout 20d;
   }
}
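Before restarting NGINX, run its built-in configuration test, which will catch any typos or mismatched braces in the files you just edited:

$ sudo nginx -t

You should see messages saying the syntax is ok and the test is successful; if not, re-check the edits above.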

Now start NGINX up again:


$ sudo service nginx start

And if all went well, your new Shiny Server should be up and running at https://yourdomain.com!

Note that even if you try to go to the insecure URL, traffic will be automatically redirected through HTTPS.
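You can confirm the redirect from the command line, too: the first server block above answers any plain HTTP request with a 301 pointing at the HTTPS version of your site (output abbreviated):

$ curl -I http://yourdomain.com

HTTP/1.1 301 Moved Permanently
Location: https://yourdomain.com/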

Setting Appropriate Permissions

Sometimes, your Shiny apps will need access to the filesystem to read or write files. Since the Shiny server runs as the user shiny, and all the files being served are owned by root, your apps will crash when they try to access those files. I like Dean Attali’s solution. Run the following commands, substituting yourusername with the username you are using to access the server:


$ sudo groupadd shiny-apps
$ sudo usermod -aG shiny-apps yourusername
$ sudo usermod -aG shiny-apps shiny
$ cd /srv/shiny-server
$ sudo chown -R yourusername:shiny-apps .
$ sudo chmod g+w .
$ sudo chmod g+s .

In the future, any time you add files under /srv/shiny-server, you may need to change the permissions so the Shiny server can read them. We’ll do that in a moment.

Installing a New App

Finally, I’m going to show you how to put a new app on the server. We’re going to use the app that Nicole created and add it into the “sample apps” folder. Run the following:


$ cd /srv/shiny-server/sample-apps
$ mkdir sampdistclt
$ cd sampdistclt
$ nano server.R

This will create a new file called server.R and open it for editing. Copy and paste the second half of the code from Nicole’s post (the part that starts with ## server) into this file. Save and exit. Now create a second file in this directory called ui.R, and paste in the code from the first half of Nicole’s post (the part that starts with ## ui, up to but not including the part that starts with ## server). Save and exit.

Now you need to make sure that the permissions are set correctly:


$ chown -R :shiny-apps .

You may also need to restart the Shiny and/or NGINX servers. The commands to do that are:


$ sudo service nginx restart
$ sudo service shiny-server restart

If all has gone well, you can now view the app up and running at https://yourdomain.com/sample-apps/sampdistclt!

Conclusion

I haven’t had a lot of time to use this configuration, so please let me know if you find any bugs or things that need to be tweaked. On the plus side, this configuration may be cheaper than using ShinyApps.io, but it doesn’t have all the cool bells and whistles you get there either, like their user interface and traffic monitoring. At the very least, it should be a way to experiment and put things out in public for others to play with. Enjoy!
