About Tagadmin

Microservice Architecture Explained. Microservice Architecture for enterprise.

Software development architecture has seen tremendous growth in recent years. Microservice Architecture is a much improved version of the modular approach. We will refer to Microservice Architecture as MSA throughout the article.

MSA has gained popularity only in recent years. Like most other software development concepts, MSA doesn't have one standard definition. Broadly, MSA is the practice of building a system out of many smaller services that are completely independent of each other. These smaller services communicate with each other to form a complex software system. Each service is fully standalone and can be developed, deployed and maintained by a separate team. Each service performs one specific task without impacting the others, which is why MSA helps with scalability.

Microservice Architecture for enterprise.

If you're new to MSA, it may give you the impression that it is only useful for developing smaller and simpler applications. In reality, big enterprises are using MSA to develop full-scale applications for their business. Most companies and startups prefer to start their product development with a monolith architecture since it's simple and quick. Once the product grows beyond a certain point, scaling becomes a challenge: since all the modules are tightly coupled in a monolith, scaling requires redeploying all of them. This is where MSA is a game changer. When an enterprise uses MSA for software development and wants to scale:
  • There is no need to rewrite the whole application; only the individual service that needs to scale is changed.
  • Since each service performs a small task, it is easier to code, maintain and scale in MSA.
  • There is no need to wait for approvals from other teams to modify one service, so scaling the application is much faster.
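The scaling story above can be sketched in a few lines of Python: two standalone HTTP services, each of which could be owned and redeployed by a separate team, cooperating over the network. The service names, ports and payloads here are invented for illustration; a real deployment would add service discovery, retries and health checks.

```python
# Minimal sketch of two independent services cooperating over HTTP.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class UserService(BaseHTTPRequestHandler):
    """Standalone service: knows only about users."""
    def do_GET(self):
        body = json.dumps({"user_id": 1, "name": "alice"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence per-request logging
        pass

class OrderService(BaseHTTPRequestHandler):
    """Separate service: fetches user data from the user service."""
    def do_GET(self):
        with urllib.request.urlopen("http://localhost:8001/") as resp:
            user = json.load(resp)
        body = json.dumps({"order_id": 42, "buyer": user["name"]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass

def serve(handler, port):
    # Each server runs in its own thread, standing in for a separate
    # deployment owned by a separate team.
    server = HTTPServer(("localhost", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

serve(UserService, 8001)
serve(OrderService, 8002)
with urllib.request.urlopen("http://localhost:8002/") as resp:
    order = json.load(resp)
print(order)
```

Because the two services share nothing but an HTTP contract, the user service could be rewritten or scaled out without touching the order service at all.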
A number of international enterprises have switched over to MSA for the sake of scalability, among them Amazon, Spotify and Walmart. All three have reported significant improvements since embracing Microservice Architecture. One possible approach to using MSA in enterprise development is to create multiple smaller full-stack teams, each working on an individual service. Every team has its own front-end and back-end developers, testers, UI designers, etc., and its responsibility is to develop and maintain one single service without impacting other modules.

Pros of MSA
  • Teams have complete freedom and can work without any external interference.
  • Each module can be written in a different language, though mixing languages is not generally recommended.
  • Development, deployment and testing are much faster.
  • A lot easier to change, scale and maintain individual services.
  • Straightforward approach to error detection and correction.
  • Integration with third-party apps is made much simpler.
Cons of MSA
  • When the number of services in the system grows beyond a certain limit, it leads to information overload and the system can become difficult to monitor.
  • If services call each other frequently, network latency adds up.
  • In some cases duplication of functionality across services might occur.
  • Switching an existing application from a monolith to MSA is a time-consuming and complicated task.
In the near future, MSA is likely to become even more successful and more widely used.

Docker Explained. Pros and Cons of Docker for modern application development

Docker is a tool designed to make it easy to create, deploy and run applications using containers. Containers allow a developer to package an application with all of its essential parts, such as libraries and other dependencies, and ship it all out as one package. By doing this, the developer can be assured that the application will run on any Linux machine regardless of any customized settings that machine might have. The mechanism is abstracted, i.e. the implementation is hidden. In a nutshell, Docker is an extension of Linux Containers (LXC), a unique kind of lightweight, application-based virtualization.

Docker is often compared to a virtual machine (VM). The difference lies in the fact that rather than creating a whole virtual operating system, Docker allows applications to use the same Linux kernel as the system they are running on; an application only needs to be shipped with things not already running on the host computer. This significantly boosts performance and reduces application size.

Docker is open source, which means that anyone can contribute to it and extend it to meet their own needs if they need additional features that aren't available out of the box. Even though it officially supports Windows and Mac, Docker is primarily Linux oriented.

Docker is mainly designed for developers and system administrators, and it serves as a part of many DevOps (developers + operations) toolchains. Developers can focus on writing programs without worrying about the system the code will run on, and they can use programs stored in Docker containers as part of their application. For operations staff, Docker provides flexibility and reduces the number of systems needed because of its low overhead and small footprint. The repeatable nature of Docker images also helps standardize production code and configurations, and Docker is widely used for continuous integration.
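The packaging workflow described above can be sketched with a minimal Dockerfile. The file names (`app.py`, `requirements.txt`) and the base image tag are illustrative assumptions, not taken from any particular project:

```dockerfile
# Minimal sketch: containerizing a small Python application.
# The image reuses the host's Linux kernel; only userspace is shipped.
FROM python:3.12-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the application code itself.
COPY . .
CMD ["python", "app.py"]
```

Assuming Docker is installed, `docker build -t myapp .` builds the image and `docker run --rm myapp` starts a container from it on any Linux host, regardless of that host's own installed libraries.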
eBay focuses on incorporating Docker into its process to standardize deployment across a distributed network of servers running as a single cluster. Application dependencies are isolated inside containers to address the problem that each server may have different software versions and special hardware; notably, the host operating system does not need to be the same as the container's operating system. The end goal is to have different hardware and software systems running as a single Mesos cluster.

Docker can also serve as a security sandbox. A Docker-based sandbox named CompileBox was released to run untrusted code and return its output without risking the host. Malicious code that attempts to destroy the system is confined to the container, which can be created and destroyed quickly as needed.

A number of companies and organizations are coming together to bring Docker to desktop applications. Microsoft is working to integrate Docker into its Azure platform, a development that could make integration of Linux applications with Microsoft products easier than before. The growth of Docker and Linux containers shows no sign of slowing, and new businesses continue to jump on the bandwagon on a regular basis.
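The sandboxing idea can be illustrated with a single `docker run` invocation. This is a sketch only; it requires a Docker installation, "untrusted-runner" is a placeholder image name, and the specific limits are assumptions rather than CompileBox's actual configuration:

```shell
# Each flag tightens the sandbox around the untrusted code:
#   --network none   no network access from inside the container
#   --memory/--cpus  cap resource consumption
#   --pids-limit     mitigate fork bombs
#   --read-only      immutable root filesystem
docker run --rm --network none --memory 128m --cpus 0.5 \
    --pids-limit 64 --read-only untrusted-runner
```

Whatever the code inside does, its damage is limited to a container that is destroyed (`--rm`) as soon as it exits.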

Data encryption at rest - SQL Azure DB

Data encryption is a transformation of data into another form to improve security; only people who have access to the secret - a decryption key or password - can read it. Its main purpose is to protect the confidentiality of digital data. SQL Server encryption is the process of encrypting connections (i.e. links), data and procedures that are stored in a database. The areas that need to be covered to secure SQL Server are the platform, authentication, objects (mainly data) and the applications that access the system. The main SQL Server encryption options are Transparent Data Encryption (TDE), column-level encryption, encrypting and decrypting data with the .NET Framework, Encrypting File System (EFS), and BitLocker.

Transparent Data Encryption (TDE) is the primary encryption option, first made available in SQL Server 2008. It enables us to encrypt the whole database; backups of databases using TDE are also encrypted, so it protects data at rest, and it is easy to implement. TDE encrypts the data stored in both the database's data file (.mdf) and log file (.ldf) using either Advanced Encryption Standard (AES) or Triple DES (3DES) encryption. Like data compression, TDE database encryption is performed at the page level: data is encrypted on disk and decrypted as it's read into memory. Because encryption happens at the page level, the process is completely transparent to client applications, and it places few limitations on searching or querying the data in the encrypted database. In addition, since most database applications are optimized to minimize input/output for performance reasons, the encryption process is efficient. If the database is used with AlwaysOn Availability Groups, database mirroring or log shipping, the participating replicas are encrypted as well.
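Enabling TDE takes only a few statements. The following T-SQL is an illustrative sketch for a self-managed SQL Server; the database name, certificate name and password are placeholders, and on Azure SQL Database the service typically manages these keys for you:

```sql
-- Placeholders: MyDatabase, TDECert, and the password are illustrative.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

USE MyDatabase;
-- The database encryption key is protected by the server certificate.
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE MyDatabase SET ENCRYPTION ON;
```

From this point on, pages are encrypted as they are written to disk and decrypted as they are read into memory, with no changes to the application.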
The main point is that TDE encrypts the stored data but doesn't encrypt the communications link between the server and the client applications. If we need to encrypt the connection between the application and the server, we need to use SSL/TLS for client connections. Technologies such as database mirroring and AlwaysOn Availability Groups support network transport encryption as endpoint properties.

Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national identification numbers, stored in SQL Server databases. Sensitive data is encrypted inside the client application, and the encryption keys are never revealed to SQL Server. As a result, it provides a separation between those who own the data and those who manage it. Always Encrypted makes encryption transparent to applications: an Always Encrypted-enabled driver installed on the client computer achieves this by automatically encrypting and decrypting sensitive data in the SQL Server client application. The driver encrypts the data in sensitive columns before passing it to SQL Server, and automatically rewrites queries so that the semantics of the application are preserved. Similarly, the driver transparently decrypts data stored in encrypted database columns and contained in query results.

Disadvantages of Transparent Data Encryption compared to Always Encrypted:
  • It only protects data at rest - backups and data files are "safe", but data in motion or in memory is vulnerable.
  • It can only encrypt the complete database, not individual columns.
  • All data is encrypted the same way.
  • It requires Enterprise Edition.
  • Data remains accessible to a system administrator.
The future of encryption and the innovation of application providers will forever influence how enterprises conduct business electronically. It begs the question, “What would we do if all our information were safe?”
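The key property of Always Encrypted, that the database only ever sees ciphertext, can be sketched in Python. This is a toy: the "cipher" below is a keystream built from SHA-256 and is NOT real cryptography (SQL Server's driver uses authenticated AES), and `sqlite3` merely stands in for SQL Server. Table and key names are invented:

```python
# Toy sketch of the Always Encrypted idea: the *client* encrypts sensitive
# columns before they reach the database, so the server stores only
# ciphertext and never holds the key.
import hashlib
import sqlite3

KEY = b"client-side-key-never-sent-to-db"  # stays with the client driver

def _keystream(key, n):
    # Derive n pseudo-random bytes from the key (toy construction only).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(plaintext: str) -> bytes:
    data = plaintext.encode()
    return bytes(a ^ b for a, b in zip(data, _keystream(KEY, len(data))))

def decrypt(ciphertext: bytes) -> str:
    data = bytes(a ^ b for a, b in zip(ciphertext, _keystream(KEY, len(ciphertext))))
    return data.decode()

# sqlite3 stands in for SQL Server: it only ever receives ciphertext.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, card_number BLOB)")
db.execute("INSERT INTO customers VALUES (?, ?)",
           (1, encrypt("4111-1111-1111-1111")))

stored = db.execute("SELECT card_number FROM customers").fetchone()[0]
print(stored)           # opaque bytes: a DBA sees only ciphertext
print(decrypt(stored))  # the client recovers the plaintext
```

This is the separation the article describes: whoever manages the database (a system administrator, a cloud operator) cannot read the column, because decryption happens only in the client that owns `KEY`.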