In this session, we will start by building a simple Express microservice. We will then create a Docker image for the service, using both a Dockerfile and the command line. Lastly, we will push our image to Docker Hub. The session will run for 30 minutes, with 15 minutes for Q&A.
In this session, we learned about the Dockerfile, the docker build command, the Docker build cache, and how to optimize the process of building Docker images. We also demonstrated how to move a Docker image between different Docker hosts. The session ran for 30 minutes.
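The cache optimization mentioned above usually comes down to layer ordering. A hedged sketch for a Node.js service (base image, file names, and port are all illustrative): Docker caches each instruction as a layer, so installing dependencies before copying the application source means code changes do not invalidate the npm install layer.

```dockerfile
# Illustrative Dockerfile for a Node.js service; names are examples.
FROM node:4

WORKDIR /app

# Copy the dependency manifest first: the npm install layer stays
# cached and is reused as long as package.json is unchanged.
COPY package.json .
RUN npm install

# Copy the application source last; code changes rebuild only from here.
COPY . .

EXPOSE 3000
CMD ["node", "server.js"]
```

Moving an image between hosts without a registry can be done by piping `docker save my-service` into `docker load` on the target host (for example over ssh).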
via Developing Front-End Microservices With Polymer Web Components And Test-Driven Development (Part 1/5): The First Component | Technology.
In this article series we’ll go through Web Components development in the context of microservices. We’ll use Polymer as the library that will help us out. The objective is to create a microservice that handles its full functionality. The service will contain not only the back-end API (as is the case with most microservices) but also the front-end, in the form of Web Components. Later on, when the time comes to use the service we’re creating, we’ll simply import its Web Components. That is quite a different approach from what you might be used to. We won’t create a Web Application that calls APIs handled by microservices. We’ll import parts of the front-end from the microservices themselves. Our objective is to have a microservice that contains everything, from front-end to back-end. Please refer to the Including Front-End Web Components Into Microservices article for the reasoning that led to this decision.
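The import-based consumption described above might look roughly like this in a consuming application (the URL and element name are hypothetical; Polymer of this era relied on HTML Imports):

```html
<!-- Consuming application's page: pull the component straight from
     the microservice that also serves the back-end API.
     URL and element name are made up for illustration. -->
<link rel="import" href="http://books-service.example.com/components/books-list.html">

<!-- The imported custom element is then used like any other tag. -->
<books-list></books-list>
```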
via Yieldbot Blog | Servers as Microservices with Node.js.
At Yieldbot, we handle billions of requests every month and Node.js is a fundamental part of our platform. We use it to render ads and content, create and serve APIs, serve web applications, log and monitor activity, as well as for reporting and analytics. To craft a more well-oiled digital machine, we spent a year breaking down our monolithic server architecture (primarily built using Python and Node.js) and evolved it into a more flexible, service-oriented architecture. By doing so, we were able to streamline at scale, significantly increasing our production deployments, decreasing the time it takes to fix bugs, and improving new product initiation times.
In the beginning, the first and likely best decision we made was choosing the Hapi Application and Service Framework as our underlying service architecture. It allowed us to redesign our application into a collection of cooperative, individually testable microservices. We used Hapi to implement our new application, running our legacy application as a plugin within it; this supported our existing stakeholders while we worked on the new design. It also provided a means to carve shared service features out into their own plugins.
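The plugin pattern described above can be sketched without pulling in hapi itself. The names below are invented and the tiny server stands in for hapi's real plugin registry; the point is the shape: each microservice (including the legacy application) is a plugin that registers its own routes on a shared server.

```javascript
// Hypothetical "legacy app" packaged as a plugin exposing one route.
const legacyApp = {
  name: 'legacy-app',
  register(server) {
    server.route({
      method: 'GET',
      path: '/legacy/orders',
      handler: () => ['order-1', 'order-2']
    });
  }
};

// A shared service feature carved out into its own plugin.
const authService = {
  name: 'auth-service',
  register(server) {
    server.route({
      method: 'POST',
      path: '/auth/login',
      handler: () => ({ token: 'example' })
    });
  }
};

// Minimal stand-in for the server's plugin registry (not real hapi).
function createServer() {
  const routes = [];
  return {
    routes,
    route(r) { routes.push(r); },
    register(plugins) { plugins.forEach(p => p.register(this)); }
  };
}

const server = createServer();
server.register([legacyApp, authService]);
```

Because every service is just a plugin, each one can be tested in isolation by registering it against a fresh server.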
via InfoQ: The APIs.json Discovery Format: Potential Engine in the API Economy.
This series focuses on three key areas of “meta-language” for Web APIs: API Description, API Discovery, and API Profiles. You’ll see articles covering all three of these important trends as well as interviews with some of the key personalities in this fast-moving space.
This InfoQ article is part of the series “Description, Discovery, and Profiles: The Next Level in Web APIs”. You can subscribe to receive notifications via RSS.
In the fast-growing world of APIs and microservices, finding just the right API when you are developing a web or mobile application, or integrating existing systems, is always a tedious task. Conversely, many API providers struggle with making their valuable API-driven resources discoverable and easily accessible to potential consumers. Until 2014 there were just a handful of API directories where API providers could go to list their APIs, and API consumers could go to find the APIs they needed. While this approach has worked for some time, it was designed exclusively for humans, not for other applications and systems that find, and make decisions around, the API resources they consume.
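For context, an apis.json file is a machine-readable index a provider publishes so that tools, not just humans, can discover its APIs. A minimal sketch (field names follow the APIs.json format; all values here are made up):

```json
{
  "name": "Example API Provider",
  "description": "Machine-readable index of the APIs we offer",
  "url": "http://example.com/apis.json",
  "specificationVersion": "0.14",
  "apis": [
    {
      "name": "Orders API",
      "description": "Create and query orders",
      "humanURL": "http://example.com/orders",
      "baseURL": "http://api.example.com/orders",
      "properties": [
        { "type": "Swagger", "url": "http://example.com/orders/swagger.json" }
      ]
    }
  ]
}
```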
via Building Microservices, part 4. Dockerize your Microservices | Callista Enterprise.
If you tried out the earlier posts in our blog series on building microservices, you are probably tired of starting the microservices manually at the command prompt. Let’s dockerize our microservices, i.e. run them in Docker containers, to get rid of that problem!
I assume that you have already heard of Docker and the container revolution? If not, there is plenty of introductory material on the subject available on the Internet, e.g. Understanding Docker.
This blog post covers the following parts:
- Install and configure Docker
- Build Docker images automatically with Gradle
- Configure the microservices for a Docker environment using Spring profiles
- Securing access to the microservices using HTTPS
- Starting and managing your Docker containers using Docker Compose
- Test the dockerized microservices
- Happy days
- Scale up a microservice
- Handle problems
- Next Step
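The Compose-based start-and-scale workflow listed above can be sketched roughly like this (service and image names are invented; the real files are in the blog post itself):

```yaml
# Illustrative docker-compose.yml; names and ports are examples only.
discovery:
  image: example/discovery-server
  ports:
    - "8761:8761"
product:
  image: example/product-service
  links:
    - discovery
edge:
  image: example/edge-server
  ports:
    - "443:8765"
  links:
    - discovery
```

With a file like this in place, `docker-compose up -d` starts all the containers, and `docker-compose scale product=2` runs a second instance of the product microservice.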