Pages with tag Docker

A simple multi-tier Node.js and Nginx deployment using Docker In a world of microservices, our Docker deployments are likely to be multiple small containers, each implementing an individual service, with an NGINX container interfacing with the web browsers. There are many reasons for doing this, but let's instead focus on how to set up NGINX and Node.js in Docker. It is relatively easy, and to explore it we'll set up a two-container system with an NGINX container and a simple Node.js app standing in as a back-end service.
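Below is a minimal sketch of the two-container idea, assuming a hypothetical Node.js image named my-node-app that listens on port 3000; the containers share a private network and NGINX proxies to the back-end by container name.

```
# Private network so the containers can reach each other by name
docker network create frontnet

# Hypothetical Node.js back-end image listening on port 3000
docker run -d --name backend --network frontnet my-node-app

# NGINX configuration proxying all requests to the back-end container
cat > default.conf <<'EOF'
server {
  listen 80;
  location / {
    proxy_pass http://backend:3000;
  }
}
EOF

docker run -d --name web --network frontnet -p 80:80 \
  -v "$PWD/default.conf:/etc/nginx/conf.d/default.conf:ro" nginx
```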
Avoid 'could not be accessed' error when deploying a Service to a Docker Swarm on AWS Launching a Docker Swarm on EC2 instances is relatively easy, but of course there are pitfalls. One appears when deploying a service to the swarm: an error message complaining that the container image could not be accessed from the ECR registry.
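One common cause, sketched here with placeholder account ID, region, and image name: the worker nodes never received your ECR credentials, so forward them with --with-registry-auth when creating the service.

```
# Log in to ECR on the manager node first (e.g. with
# "aws ecr get-login-password ... | docker login ..."), then forward those
# credentials to the swarm agents when creating the service.
docker service create --with-registry-auth \
  --name my-service \
  123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest
```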
Connect with SSL to MySQL in Docker container

MySQL throws an error if you connect without using SSL, so the MySQL team is making it clear it's best to use SSL. Clearly a database connection carries critical data that you don't want to leak to third parties, so encrypting the connection is preferred. Even better is tight control limiting who can see the database connection at all. The official MySQL Docker container automatically generates a set of SSL certificates to use for connections, so let's see how to put those certificates to use.
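A minimal sketch of putting those certificates to work, assuming a container named mysql created from the official image with port 3306 published: copy the auto-generated certificates out of the data directory, then connect with verification enabled.

```
# The official image generates these files in the data directory
docker cp mysql:/var/lib/mysql/ca.pem .
docker cp mysql:/var/lib/mysql/client-cert.pem .
docker cp mysql:/var/lib/mysql/client-key.pem .

# Connect over SSL and verify the server certificate against the CA
mysql --host=127.0.0.1 --port=3306 --user=root -p \
  --ssl-ca=ca.pem --ssl-cert=client-cert.pem --ssl-key=client-key.pem \
  --ssl-mode=VERIFY_CA
```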

Correctly launch MySQL on Docker for Windows, avoiding 'Bind on unix socket' error Docker is an excellent tool for launching Linux-containerized applications, and it even runs on Windows. But running Docker containers on Windows has a few unexpected rough edges. One shows up when the default way to launch MySQL doesn't work on Windows. Instead of the expected successful launch, you might be told Can't start server : Bind on unix socket and asked whether another MySQL server is already running. That misleading error can send you on a tangential wild goose chase.
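One workaround that often resolves it, sketched here with placeholder names: keep the data directory in a named Docker volume (which lives inside the Linux VM) rather than bind-mounting a Windows folder, since a Windows filesystem cannot host the MySQL Unix socket.

```
docker volume create mysql-data
docker run -d --name mysql \
  -e MYSQL_ROOT_PASSWORD=secret \
  -v mysql-data:/var/lib/mysql \
  -p 3306:3306 mysql
```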
Creating a Docker Swarm using Multipass and Ubuntu 20.04 on your laptop Docker is a cool system for deploying applications as reusable containers, and Docker Swarm is a Docker orchestrator that lets us scale the number of containers across multiple machines. Multipass is a very lightweight virtual machine manager application running on Windows, Linux and macOS, that lets us easily set up multiple Ubuntu instances on our laptop with low performance impact. Therefore Multipass can serve as a means to easily experiment with Docker Swarm on your laptop, learning how it works, setting up networks, etc.
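A rough sketch of the flow, with hypothetical VM names and placeholder join parameters: launch two Ubuntu 20.04 instances, install Docker in each, then form the swarm.

```
multipass launch 20.04 --name swarm1
multipass launch 20.04 --name swarm2

# Install Docker inside each VM using the convenience script
for vm in swarm1 swarm2; do
  multipass exec $vm -- bash -c 'curl -fsSL https://get.docker.com | sudo sh'
done

# Initialize the swarm on the first VM; it prints the join token and address
multipass exec swarm1 -- sudo docker swarm init

# Join the second VM using the values printed above
multipass exec swarm2 -- sudo docker swarm join --token <worker-token> <manager-ip>:2377
```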
Creating a Docker Swarm with Raspberry Pi Zeros for easy cluster computing

Docker is a powerful basis for cloud computing, especially if you use Docker Swarm. This tutorial shows how to autoscale Docker images over a cluster of inexpensive Raspberry Pi Zero computers. It's an interesting way to learn about Docker and using Docker Swarms. The example shows autodeployment of Node-RED instances to individual Pi Zeros, which raises interesting ideas in my mind.

A downside to this example is the laborious setup. Each Pi Zero must have a customized boot SD card created. Seems to me that marrying this idea with the "Cluster HAT" hardware might be easier to manage, since you wouldn't need to create a customized SD card for each machine in the cluster.

Deploying Docker images to a server without using a Docker Registry

We formerly deployed server applications to a Linux server using manual processes. An advanced team might use shell scripts to automate deployment. Over time, tools like Chef and Ansible grew to handle ever-more-complex server application deployment scenarios. A few years ago, Docker came onto the scene with a whole new approach involving building a "Container" housing a complete operating system image that runs your application. Having built the Container, it's easy to ship that container to a server or run it on your laptop. The compelling gain is having the exact same development environment on your laptop as is deployed to your servers. Using the EXACT same environment streamlines your work by removing a ton of potentially destabilizing variables.

The preferred method is to build a Docker container image on your laptop, or on a build server, and upload the image to a Docker Registry. The image can then be downloaded from the Registry onto any number of systems.

What if you don't want to, or cannot, use a Registry? You could instead deploy the source code to the server and build the container image there, but that's a very unwise move; it's better to ship the already-built container image to the server. Turns out that is easy to do.
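A minimal sketch of how, assuming SSH access to the server and an image tagged my-app:latest on your laptop:

```
# Stream the image over SSH; no Registry involved
docker save my-app:latest | gzip | ssh user@server 'gunzip | docker load'

# Then run it on the server
ssh user@server docker run -d --name my-app my-app:latest
```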

Deploying a simple multi-tier Node.js and Nginx deployment to AWS ECS Amazon's AWS Elastic Container Service (ECS) lets us deploy Docker containers to the AWS cloud. ECS is a very complex beast to tame, but Amazon offers a method of using Docker Compose to describe an ECS Service. That hugely simplifies launching tasks on AWS ECS.
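A hedged sketch of that flow, assuming Docker Desktop's ECS integration is available and AWS credentials are configured: create an ECS context, then bring the Compose file up against it.

```
docker context create ecs myecscontext    # prompts for an AWS profile
docker --context myecscontext compose up  # deploys the compose file as an ECS service
```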
Deploying an Express app with HTTPS support in Docker using Lets Encrypt SSL Getting HTTPS support on all websites is extremely important for a variety of security purposes. HTTPS both encrypts the traffic as it crosses the Internet and validates the identity of the website. Let's Encrypt is a fabulous service providing free SSL certificates, which are required to implement HTTPS. For this posting we'll be implementing a simple Express app, deploying it using Docker and Docker Compose, using Let's Encrypt to acquire the SSL certificates, and hosting the app on a cloud hosting service.
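A hedged sketch of the certificate acquisition step using the certbot Docker image; the domain, email address, and volume names are placeholders, and the webroot volume must be shared with whatever serves /.well-known for the domain.

```
docker run --rm \
  -v letsencrypt:/etc/letsencrypt \
  -v webroot:/var/www/html \
  certbot/certbot certonly --webroot -w /var/www/html \
  -d example.com --email you@example.com --agree-tos --no-eff-email
```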
Easily manage Docker containers on both local and remote Docker hosts with Portainer Docker is a wonderful advancement for software engineers and system administrators. It simplifies launching and maintaining background processes, while adding a layer of much-needed encapsulation and security. But the default command-line administrative tools are less than pleasant to use, and we instead want a good GUI with which to manage our Docker hosts. That's where Portainer comes in. It manages the Docker containers we have running on our local host, or on remote hosts.
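A typical way to launch Portainer CE against the local Docker host (the container and volume names are arbitrary); it manages Docker through the mounted socket.

```
docker volume create portainer_data
docker run -d --name portainer -p 9000:9000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce
```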
Getting started with Docker: Installation, first steps

Let's start this journey into Docker by learning how to install it on popular systems. Installation is a lot simpler today than at the beginning, especially on Mac and Windows. You youngsters don't know how easy you have it now that Docker for Mac and Docker for Windows exist. In the old days we had to walk 10 miles through the snow, uphill both ways, to install VirtualBox along with a specialized virtual machine to use Docker.
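On Linux the convenience script is the quickest route (on Mac and Windows, install Docker Desktop from docker.com instead); a minimal sketch:

```
curl -fsSL https://get.docker.com | sh
sudo usermod -aG docker $USER   # optional: run docker without sudo (re-login required)
docker run hello-world          # verify the installation
```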

HTTPS with nginx, using Lets Encrypt, proxying to Gogs and Jenkins back-end services

Modern development environments require a continuous integration system, along with a reasonable git-based repository hosting service. It's possible to rent these services; GitHub and GitLab, for example, are both excellent hosted Git repository services, and there are several hosted continuous integration systems. GitLab in particular is a one-stop shop offering both Git hosting and continuous integration in one service. But you can easily host Git and Continuous Integration services on your own hardware. And with a little work the services can be HTTPS-protected using Let's Encrypt.
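A rough sketch of the proxying piece, with assumed hostnames, ports (Gogs on 3000, Jenkins on 8080), and Let's Encrypt certificate paths:

```
cat > /etc/nginx/conf.d/git.example.com.conf <<'EOF'
server {
  listen 443 ssl;
  server_name git.example.com;
  ssl_certificate     /etc/letsencrypt/live/git.example.com/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/git.example.com/privkey.pem;
  location / { proxy_pass http://localhost:3000; }  # Gogs back-end
}
EOF
# A nearly identical server block for jenkins.example.com would
# proxy_pass to http://localhost:8080 instead.
```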

Installing Gitea for self-hosted Git service, replacing Gogs While Gogs is an excellent tool for self-hosting a Git service (like GitHub), I recently found out the project is semi-abandoned. A group of Gogs users launched Gitea as a replacement, and in any case it looks like a better server. The goal here is to install Gitea, evaluate it, and see how to convert Gogs-based repositories over to Gitea. The result will run on the Docker self-hosting machine I have at home.
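A minimal sketch of running Gitea in Docker, using the image defaults (web UI on 3000, SSH on 22) and a named volume for its data:

```
docker volume create gitea-data
docker run -d --name gitea \
  -p 3000:3000 -p 2222:22 \
  -v gitea-data:/data \
  gitea/gitea:latest
```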
Learn to use Docker for application development and deployment

Docker is a wonderful tool that abstracts away all kinds of details about configuring and maintaining Linux Containers. The power to simply type "docker run image-name" and have a bunch of complexity automatically handled is great. However Docker is one of those tools with lots of moving parts behind the scenes, and some training is needed to use it well.

Make your own Raspberry Pi git repository server with Gogs and Docker The Raspberry Pi is an amazing little computer that, while targeted at the DIY hardware maker, is a full-fledged Linux computer that can be used to run services that used to require much bigger and more expensive computers. How long ago was it that office servers had to be $4000 systems from the likes of Dell? It seems that the Raspberry Pi (and other tiny computers) can perform the same tasks at a low cost with minuscule energy requirements. To this end I'm setting up Gogs (a GitHub-like server for Git repositories) on a Raspberry Pi. As I worked on the project it seemed most straightforward to use Docker to manage the Gogs process, and therefore the project became setting up Docker on Raspberry Pi to run other services.
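A hedged sketch of the Gogs container itself (on a Raspberry Pi you may need an ARM-compatible build of the image): Gogs serves its web UI on 3000 and Git-over-SSH on 22, and keeps its state under /data.

```
docker volume create gogs-data
docker run -d --name gogs \
  -p 3000:3000 -p 10022:22 \
  -v gogs-data:/data \
  gogs/gogs
```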
Manage Letsencrypt HTTPS/SSL certificates with a Docker container using Cron, Nginx, and Certbot Modern websites must have HTTPS support for security reasons. As a result, web browsers and search engines have begun downgrading sites that do not support HTTPS. That means we all must have a simple, low-cost way to set up HTTPS support on our websites. The Letsencrypt project offers free SSL certificates for HTTPS. In this project we will create a Docker container for handling HTTPS via Nginx, and automated SSL certificate renewal using the Letsencrypt command-line tools (Certbot).
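A sketch of the renewal piece, assuming cron and certbot are installed in the container: a cron entry that attempts renewal twice a day and reloads NGINX whenever a certificate actually changes.

```
cat > /etc/cron.d/certbot-renew <<'EOF'
0 3,15 * * * root certbot renew --deploy-hook "nginx -s reload" >> /var/log/certbot.log 2>&1
EOF
```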
Moving Docker's files to a custom location

Docker is a wonderful tool that abstracts away all kinds of details about configuring and maintaining Linux Containers. The power to simply type "docker run image-name" and have a bunch of complexity automatically handled is great. But you may want to change Docker's defaults, and just how do you do so? In my case "/var/lib/docker" would be on an SSD drive, and to lengthen its lifetime I want to minimize the number of writes to that drive. Moving this directory off the SSD should help with that goal.
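A minimal sketch of the move (the destination path is an assumption): point the daemon at a new data-root in /etc/docker/daemon.json, copy the existing data across, and restart.

```
sudo systemctl stop docker
sudo mkdir -p /home/docker-data
sudo tee /etc/docker/daemon.json <<'EOF'
{ "data-root": "/home/docker-data" }
EOF
sudo rsync -a /var/lib/docker/ /home/docker-data/
sudo systemctl start docker
```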

Remotely control a Docker Engine or Docker Swarm in two easy steps How do we manage a Docker instance on another computer? We can SSH into that instance and run Docker commands there, but a not-well-documented feature built into Docker lets us remotely access, and manage, Docker Swarm clusters or Docker Engine instances. It's fast, easy and very powerful.
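The two steps, roughly (user and hostname are placeholders): confirm SSH access to the remote Docker host, then point the local CLI at it.

```
ssh user@remote-host docker version        # step 1: SSH works and Docker responds
export DOCKER_HOST=ssh://user@remote-host  # step 2: aim the local docker CLI at it
docker ps                                  # now runs against the remote engine
```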
Run Linux/X11 apps in Docker and display on a Mac OS X desktop While the MacPorts and Homebrew projects bring many Linux apps to the Mac environment, they don't support every app we'd want to run. Because the X11 environment is not native to macOS, porting such apps is not a simple recompile; the GUI layer would have to be rewritten. Thankfully there is an X11 display server for Mac OS X that can be used to run an application in a Linux environment and display it on the macOS desktop. In this article we'll look at one way to get this all connected up and running.
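One hedged sketch of the wiring, assuming XQuartz is installed with its "Allow connections from network clients" preference enabled, and a hypothetical image containing an X11 app:

```
open -a XQuartz
xhost +127.0.0.1                 # allow local TCP clients to use the display
docker run --rm -e DISPLAY=host.docker.internal:0 some-x11-app-image
```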
Scheduling background tasks using cron in a Docker container

Sometimes you want a Docker container to execute background tasks, and therefore want cron to be installed and running. Having cron running in the background is part of normal Unix/Linux/etc system admin practices. Even though the crontab format is kind of hokey, we all learn it and set up automated background tasks to keep the world functioning. Let's see how to set this up in a Docker container.
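A minimal sketch, assuming a Debian-based image: install cron, drop in a crontab entry, and run cron in the foreground as the container's main process.

```
docker build -t cron-demo - <<'EOF'
FROM debian:stable-slim
RUN apt-get update && apt-get install -y cron
RUN echo '* * * * * root date >> /var/log/cron-demo.log 2>&1' > /etc/cron.d/demo \
    && chmod 0644 /etc/cron.d/demo
CMD ["cron", "-f"]
EOF
docker run -d --name cron-demo cron-demo
```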

Set up MongoDB Docker image

MongoDB, as one of the popular NoSQL databases, is part of many software development projects. Hence, one must know how to configure and set up MongoDB in a Docker environment.
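A minimal sketch of the official image with persistent storage and root credentials (names and password are placeholders):

```
docker volume create mongo-data
docker run -d --name mongodb \
  -e MONGO_INITDB_ROOT_USERNAME=admin \
  -e MONGO_INITDB_ROOT_PASSWORD=secret \
  -v mongo-data:/data/db \
  -p 27017:27017 mongo
```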

Set up MySQL Docker image on your laptop, then verify it works using phpMyAdmin

Setting up MySQL on Docker is fairly simple, and the MySQL team has done a credible job creating a flexible Docker image that can be used in many circumstances. Once the MySQL container is set up, you need a method to verify it can be accessed from other containers, and to manage the database it contains. Enter the mysql-client and phpMyAdmin Docker images. Both are easy to set up, and easy to use. Typically when deployed as part of an application stack, the MySQL container won't be visible to the public Internet but does need to be visible to other containers in your deployed application. Hence, there must be a private bridge network the containers use to communicate with each other, and the only published ports are those required to supply the service to the public.

Unlike the MAMP product, what's shown here is equally applicable to macOS and Windows laptops.
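A minimal sketch of that arrangement (names and password are placeholders): a private bridge network, a MySQL container with no published ports, and phpMyAdmin published on port 8080 to verify access.

```
docker network create dbnet
docker run -d --name db --network dbnet \
  -e MYSQL_ROOT_PASSWORD=secret mysql
docker run -d --name pma --network dbnet -p 8080:80 \
  -e PMA_HOST=db phpmyadmin/phpmyadmin
```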

Solve 'Drive has not been shared' error with Docker on Windows We often mount folders into a Docker container to ensure data is persistent while letting us freely destroy and recreate the container. But on Windows you might get a head-scratching error message saying "Unhandled exception: Drive has not been shared". The most common advice that might come up when searching the Internet is not about this situation, but about general file sharing in Windows. In this case the error refers to configuration settings in Docker.
Terraform deployment of a simple multi-tier Node.js and Nginx deployment to AWS ECS Amazon's AWS Elastic Container Service (ECS) lets us deploy Docker containers to the AWS cloud. ECS is a very complex beast to tame, but Terraform offers a way to easily describe infrastructure not only on AWS but on many other cloud services. Using Terraform is hugely simpler than any tool offered by AWS for ECS.
Terraform deployment on AWS ECS of a multi-tier Node.js and Nginx system using Service Discovery Amazon's AWS Elastic Container Service (ECS) lets us deploy Docker containers to the AWS cloud. In earlier postings of this series we hosted an NGINX/Node.js application stack using Docker on our laptop, then on AWS ECS, then on AWS ECS using Terraform. In this tutorial we switch to hosting the application using two AWS ECS Service instances, and use AWS Service Discovery between the instances.
Understanding the MySQL Access Denied error in or outside a Docker container The other day I wasted more than a full workday on the MySQL error Access denied for user (using password: YES), and I want to help others avoid this problem. Along the way to fixing the issue, I learned a lot about how MySQL authenticates user IDs. I've been using MySQL for years and had glossed over this topic, but it turns out to not be terribly difficult.
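The usual culprit, sketched here with placeholder names: MySQL authenticates user@host pairs, so a user created for 'localhost' won't match connections arriving from another container's IP address. Creating the user with a wildcard host is one remedy.

```
docker exec -it mysql \
  mysql -uroot -p -e "CREATE USER 'appuser'@'%' IDENTIFIED BY 'secret'; GRANT ALL ON appdb.* TO 'appuser'@'%'; FLUSH PRIVILEGES;"
```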
Using Docker to host ARM toolchain to cross-compile C code I'm starting up a project that will see me doing custom software development for an ARM single-board computer running Linux. The recommendation isn't to do compiles ON the board, but instead to cross-compile from a Linux workstation (Debian). But I use a Mac laptop, as do most software engineers these days. While I could run VirtualBox to set up a Debian cross-compiling environment, Docker is much lighter weight. While Docker was originally targeted at deploying server applications, it is useful for packaging anything. In this case there's a ready-made set of Docker containers for cross-compilation, including for ARM CPUs.
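One ready-made option is the dockcross project (whether that's the same toolchain this posting uses is my assumption): running the image emits a helper script that wraps the cross-compiler.

```
docker run --rm dockcross/linux-armv7 > ./dockcross
chmod +x ./dockcross
./dockcross bash -c '$CC hello.c -o hello'   # cross-compiles hello.c for ARM
```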