TOP MNC — AWS / DevOps INTERVIEW QUESTIONS

What is GIT stash?

Git stash is a feature in Git that allows you to temporarily save your changes without committing them. It's useful when you need to switch branches or perform some other operation that requires a clean working directory.
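
For example, a minimal sketch (the hotfix branch name is only an illustration):

git stash                 # save uncommitted changes and clean the working directory
git checkout hotfix       # switch branches and do other work
git checkout -            # return to the previous branch
git stash pop             # reapply the stashed changes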

What is a branching strategy?

A branching strategy is a set of rules and guidelines for how branches are created, named, and managed in a version control system like Git. It defines how code changes flow from development through testing to production.

What is the command to discard changes in the working directory?

The command to discard changes in the working directory in Git is git checkout -- <file>, or git checkout . to discard all changes. In newer Git versions, git restore <file> (or git restore .) does the same thing.

What is the command to change the ownership and permissions of a file or directory in Linux?

To change ownership: chown [new_owner] [file/directory]

To change permissions: chmod [permissions] [file/directory]

Change ownership to "newuser": chown newuser /var/www/myfile.txt

Change permissions to read and write for the owner: chmod u+rw /var/www/myfile.txt

How are Kubernetes clusters created?

A Kubernetes (K8s) cluster can be created using a variety of tools (example commands are shown after the list), including:

  • Minikube

    An open-source tool that can be used to create a simple cluster with one worker node. It's compatible with Linux, Mac, and Windows.

  • Kubeadm

    A tool that can be used to create a cluster that meets best practices and passes Kubernetes Conformance tests. It can also be used for other cluster lifecycle functions, like bootstrap tokens and cluster upgrades.

  • Kubectl

    A command line tool that can be used to communicate with a Kubernetes cluster's control plane.

  • Helm

    A package manager for Kubernetes that can be used to define, install, and upgrade applications packaged as charts.
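
As a hedged sketch, the first two tools can bootstrap a cluster like this (the node count and the Flannel pod CIDR are common examples, not requirements):

minikube start --nodes 2                              # small local cluster for development
sudo kubeadm init --pod-network-cidr=10.244.0.0/16    # bootstrap the first control-plane node
kubectl get nodes                                     # verify the cluster is up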

How do you uncommit the changes that have already been pushed to GitHub?

You can revert the commit history by using git revert or git reset:

  • git revert <commit>: This creates a new commit that undoes the changes in a previous commit.

  • git reset --hard <commit>: This will reset the current branch to the specified commit, discarding any commits made after it. You'll need to push the changes with git push --force.

Note: Be cautious when using git reset --hard, especially with git push --force, as it rewrites history and can cause issues for other collaborators.
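
A minimal sketch of both approaches (assuming the branch is main):

git revert <commit>            # safe: adds a new commit that undoes <commit>
git push origin main

git reset --hard <commit>      # destructive: drops every commit after <commit>
git push --force origin main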

How do you debug the exited container?

You can debug an exited container by using the docker start -ai <container_id> command, which starts the container interactively and attaches your terminal to it. You can then inspect logs, check configurations, or run debugging commands within the container.
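
A typical debugging sequence might look like this (the container ID is a placeholder):

docker ps -a                      # find the exited container and its exit code
docker logs <container_id>        # see what it printed before exiting
docker inspect <container_id>     # review its configuration and state
docker start -ai <container_id>   # restart it attached to your terminal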

How do you execute jobs in parallel in Jenkins?

Jobs can be executed in parallel in Jenkins by using the "Parallel" step in Jenkins Pipeline or by configuring concurrent builds in the job configuration.

Maven Lifecycle

Maven is a build management tool. It uses a simple pom.xml file to configure all the dependencies needed to build, test, and run the code.

Maven defines standard build lifecycles: the default lifecycle consists of phases such as validate, compile, test, package, verify, install, and deploy, while clean belongs to the separate clean lifecycle. These phases define the order in which plugin goals are executed.
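
For instance, invoking a phase runs it together with every phase before it:

mvn validate        # check the project structure and the POM
mvn test            # compile the code and run unit tests
mvn package         # build the JAR/WAR into target/
mvn install         # install the artifact into the local ~/.m2 repository
mvn clean deploy    # clean, rebuild, and publish to a remote repository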

Your team wants a Grafana dashboard to visualize the HTTP request latency of your applications running in EKS. How would you achieve this?

To visualize HTTP request latency in Grafana:

  1. Use Prometheus to scrape metrics from your EKS cluster.

  2. Set up Grafana to connect to Prometheus as a data source.

  3. Create a dashboard in Grafana with a query like:

     http_request_duration_seconds_bucket{job="nginx", status="2xx"}

This will show latency metrics, which can be graphed in Grafana.

How do you upgrade Jenkins?

Jenkins can be upgraded by downloading the latest version from the Jenkins website and following the upgrade instructions provided in the documentation.

What is called a Parameterised Job in Jenkins?

A Parameterised Job in Jenkins is a job that accepts parameters when it is triggered. These parameters can be used to customize the job's behavior or configuration.

What is called Docker Swarm?

Docker Swarm is Docker's native clustering and orchestration solution. It allows users to create and manage a swarm of Docker nodes, effectively turning them into a single, virtual Docker host. Docker Swarm enables the scaling of applications, load balancing, and high availability.
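
A minimal sketch of setting up a swarm and scaling a service (the nginx image and published port are just an example):

docker swarm init                                              # make this node a swarm manager
docker swarm join --token <worker-token> <manager-ip>:2377     # run on each worker node
docker service create --name web --replicas 3 -p 80:80 nginx   # deploy a replicated service
docker service ls                                              # check the service and its replicas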

How do you handle artifacts in Nexus?

Nexus is a repository manager that can be used to store and manage artifacts like JAR files, WAR files, and Docker images. You can handle codes in Nexus by uploading artifacts to the repository and managing them using Nexus's web interface or API.

How do you manage space issues in the Jenkins server?

Space issues in Jenkins can be managed by regularly cleaning up old build artifacts, configuring build retention policies, and monitoring disk usage.

What is called a multibranch project in the Jenkins server?

A multi-branch project in Jenkins is a project type that automatically creates Jenkins jobs for each branch in a repository. It scans the repository for branches and creates jobs for them, allowing you to build and test each branch independently.

How do you secure the Jenkins server?

Jenkins servers can be secured by enabling authentication, configuring authorization, using HTTPS for communication, restricting access to sensitive information, and regularly applying security updates.

How do you manage GitHub roles?

GitHub roles can be managed by assigning appropriate permissions to users or teams within a GitHub organization. This can be done using the GitHub web interface or API.

What is called a NULL resource in Terraform?

In Terraform, a null resource is a placeholder resource that does nothing but can be used to trigger actions or execute provisioners based on changes in other resources.

What is called terraform fmt?

terraform fmt is a command in Terraform used to format Terraform configuration files according to a standard style.
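
Common invocations:

terraform fmt              # format .tf files in the current directory
terraform fmt -recursive   # also format files in subdirectories
terraform fmt -check       # exit non-zero if files need formatting (useful in CI)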

What is called Snowball?

Snowball is a service provided by AWS that allows you to transfer large amounts of data to and from the AWS cloud using physical storage devices.

How do you manage credentials in Terraform?

Credentials in Terraform can be managed using environment variables, CLI flags, or by configuring providers to use IAM roles or access keys stored securely.
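
For example, the AWS provider reads the standard environment variables, so one hedged approach is (the values are placeholders, not real keys):

export AWS_ACCESS_KEY_ID="<access-key-id>"
export AWS_SECRET_ACCESS_KEY="<secret-access-key>"
export AWS_DEFAULT_REGION="us-east-1"
terraform plan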

What is called Code Deploy in AWS?

AWS CodeDeploy is a service that automates code deployments to EC2 instances, on-premises servers, Lambda functions, and other compute services.

Can you attach a single EBS volume to multiple EC2 instances at the same time?

No, a single EBS volume can only be attached to one EC2 instance at a time.

Can you use Multiple FROM in DockerFile?

Yes. A Dockerfile can contain multiple FROM instructions; this is the basis of multi-stage builds. Each FROM starts a new build stage, artifacts can be copied between stages with COPY --from=<stage>, and only the final stage ends up in the resulting image.

DockerFile runs as which user?

By default, commands in a Dockerfile are executed as the root user, unless a USER instruction switches to a different user.

How can we pass an argument to DockerFile?

Arguments are declared in the Dockerfile with the ARG instruction and passed in using the --build-arg flag when running the docker build command.
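
A small illustration (APP_VERSION is a hypothetical argument declared with ARG APP_VERSION in the Dockerfile):

docker build --build-arg APP_VERSION=2.3 -t myapp:2.3 .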

What are deployment strategies?

Deployment strategies are techniques used to release new versions of software while minimizing downtime and risk. Examples include blue-green deployment, canary deployment, rolling updates, etc.

What is called an application load balancer?

An application load balancer is a type of load balancer provided by AWS that operates at the application layer (Layer 7) of the OSI model, allowing it to route traffic based on content, cookies, or URL patterns.

What is Kubernetes architecture?

Kubernetes architecture consists of a control plane (master) that manages the cluster and worker nodes where containers are deployed. The control plane includes components like the API server, scheduler, controller manager, and etcd for state management; each worker node runs the kubelet, kube-proxy, and a container runtime.

What is called Fargate service in AWS?

AWS Fargate is a serverless compute engine for containers that allows you to run containers without managing the underlying infrastructure.

What are Register targets in Ansible?

In Ansible, register is a task keyword that stores the result of a task or command (its output, return code, and so on) in a variable for later use in the playbook.

How do you pull artifacts from Nexus?

Artifacts can be pulled from Nexus using tools like Maven, Gradle, or Docker, which are configured to resolve dependencies from the Nexus repository.
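
Two hedged examples, with the Nexus URL and artifact coordinates purely illustrative:

mvn dependency:get -DremoteRepositories=https://nexus.example.com/repository/maven-releases -Dartifact=com.example:myapp:1.0.0
curl -u <user>:<password> -O https://nexus.example.com/repository/maven-releases/com/example/myapp/1.0.0/myapp-1.0.0.jar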

How to access the S3 bucket privately?

S3 buckets can be accessed privately by configuring bucket policies, IAM policies, or access control lists (ACLs) to restrict access to specific users or roles.
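
For ad-hoc private access you can also generate a temporary pre-signed URL (the bucket and key are placeholders):

aws s3 presign s3://my-private-bucket/reports/q1.pdf --expires-in 3600   # URL valid for one hour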

What is the difference between a NAT instance and a NAT Gateway?

A NAT instance is an EC2 instance configured to forward traffic from instances in a private subnet to the internet. A NAT Gateway is a managed service provided by AWS that performs the same function but is highly available and scalable.

How can you restrict particular IPs accessing EC2 instances?

You can restrict access to EC2 instances by configuring security groups or network access control lists (NACLs) to allow traffic only from specific IP addresses or ranges.
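
For example, allowing SSH only from a single address with the AWS CLI (the group ID and IP are placeholders):

aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 22 --cidr 203.0.113.10/32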

What is called VPC peering?

VPC peering is a networking connection between two VPCs that lets you route traffic between them privately using their internal IP addresses, as if they were part of the same network.

What is called Transit Gateway?

A gateway is a network device that connects different networks; for example, an Amazon VPC (Virtual Private Cloud) can be connected to an on-premises network using an AWS VPN Gateway.

A Transit Gateway is a service in AWS that allows you to connect multiple VPCs and on-premises networks together in a hub-and-spoke model.

What are the types of autoscaling?

The types of autoscaling in AWS include:

Horizontal autoscaling: This means increasing the number of instances/systems for your application.

Vertical autoscaling: Increasing the capacity of an individual resource, such as upgrading to a larger instance type in Amazon EC2 to handle more load.

Scheduled autoscaling: Scaling based on predefined schedules.

Predictive autoscaling: Scaling based on predicted load patterns.

To prevent DDoS attacks, which load balancer is used?

To prevent DDoS attacks, AWS offers the AWS Shield service, which protects against DDoS attacks at the network and application layers. Application Load Balancers (ALB) and Network Load Balancers (NLB) can be configured with AWS Shield for additional protection.

What is called a sticky session?

A sticky session, also known as session affinity, is a feature of load balancers that ensures that requests from the same client are always routed to the same backend server. This is useful for applications that store session state on the server side.

What is called Lambda?

AWS Lambda is a serverless compute service through which you can run your code without provisioning or managing any servers.

It only runs your code when needed and also scales automatically when the request count increases. It supports various programming languages and can be triggered by events from other AWS services or custom sources.

How do you manage the tfstate file in Terraform?

The tfstate file in Terraform contains the state of your infrastructure. It should be stored securely and managed using a remote backend like Amazon S3 or HashiCorp Consul. This ensures that the state is accessible to all members of your team and is not lost if a local copy is destroyed.

How do you create multiple EC2 instances in Terraform?

Multiple EC2 instances can be created in Terraform by defining several aws_instance resource blocks, or more idiomatically by using the count or for_each meta-arguments on a single aws_instance resource to create many instances from one block.

AWS has released a new service, how does Terraform behave?

Terraform typically releases updates to support new AWS services shortly after they are released. Users can check the Terraform documentation or the Terraform AWS provider release notes for information on when support for new services is added.

How do you uncommit the changes that have already been pushed to GitHub?

To undo commits that have already been pushed to GitHub, you can use git revert to create a new commit that undoes the changes introduced by a previous commit, or git reset --hard to remove commits from the current branch entirely (followed by a force push). However, caution should be exercised when using git reset --hard, as it rewrites history.

What is the difference between git pull and git fetch?

git pull fetches changes from a remote repository and merges them into the current branch, while git fetch only fetches changes from the remote repository but does not merge them. This allows you to inspect the changes before merging them into your local branch.
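
A small sketch of the fetch-then-inspect workflow (assuming the remote branch is origin/main):

git fetch origin               # download new commits without touching your branch
git log HEAD..origin/main      # review the incoming changes
git merge origin/main          # integrate when ready (git pull = git fetch + git merge)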

What is called Jenkins File?

A Jenkinsfile is a text file that contains the definition of a Jenkins Pipeline. It is written in Groovy syntax and defines the steps to be executed in the pipeline, including building, testing, and deploying software.

What is called Shared Libraries in Jenkins?

Shared Libraries in Jenkins are reusable scripts or Groovy code that can be shared across multiple Jenkins pipelines. They allow you to define common functions, steps, or variables that can be used in various pipelines to promote code reuse and maintainability.

What is called docker networking?

Docker networking refers to the networking capabilities of Docker containers, including creating virtual networks, connecting containers to networks, and exposing ports to allow communication between containers or between containers and the host system.

What is called a Trust relationship in AWS?

A trust relationship in AWS defines which entities are trusted to assume roles within an AWS account. It specifies the trusted entities (such as other AWS accounts or IAM users) and the permissions they are granted when assuming the role.

What are public and private subnets?

A public subnet is a subnet in a VPC that has a route to the Internet, typically through an Internet gateway, allowing resources in the subnet to communicate with the Internet.

A private subnet, on the other hand, does not have a route to the Internet and is typically used for resources that should not be directly accessible from the Internet.

How do you establish a connection between EC2 instances?

You can establish a connection between EC2 instances by using SSH (for Linux instances) or RDP (for Windows instances) to connect directly to the instance over the internet or through a private network if the instances are in the same VPC.

What is realm command?

The realm command, provided by the realmd package on Linux, is a command-line tool for discovering and joining identity domains such as Active Directory or FreeIPA (for example, realm discover example.com and realm join example.com). It lets you manage a machine's membership in those realms from the command line.

How do you differentiate within an AWS account dev env, test env, and prod env?

You can differentiate between development (dev), testing (test), and production (prod) environments in AWS by using separate AWS accounts, separate VPCs within the same account, or by using tags and naming conventions to distinguish resources belonging to each environment.

Types of EC2 instances?

AWS offers a variety of EC2 instance types optimized for different use cases, including general-purpose, compute-optimized, memory-optimized, storage-optimized, and GPU instances, among others.

How can you encrypt the already created unencrypted EBS without creating a fresh EC2 instance?

You can encrypt an unencrypted EBS volume without creating a fresh EC2 instance by taking a snapshot of the unencrypted volume, creating a new encrypted volume from the snapshot, and attaching it to the instance as a replacement for the unencrypted volume.
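
A hedged outline of the same flow with the AWS CLI (all IDs, the region, and the device name are placeholders; stop the instance or unmount the volume before the detach/attach step):

aws ec2 create-snapshot --volume-id <unencrypted-volume-id> --description "pre-encryption snapshot"
aws ec2 copy-snapshot --source-snapshot-id <snapshot-id> --source-region us-east-1 --encrypted
aws ec2 create-volume --snapshot-id <encrypted-snapshot-id> --availability-zone us-east-1a
aws ec2 detach-volume --volume-id <unencrypted-volume-id>
aws ec2 attach-volume --volume-id <new-encrypted-volume-id> --instance-id <instance-id> --device /dev/xvdf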

How do you install Nginx in the Ansible playbook?

You can install Nginx in an Ansible playbook by using the apt or yum module to install the Nginx package and the service module to ensure that the Nginx service is running.
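
The same idea can be sketched with ad-hoc commands using those modules (the webservers group and inventory.ini path are hypothetical):

ansible webservers -i inventory.ini -b -m apt -a "name=nginx state=present update_cache=yes"
ansible webservers -i inventory.ini -b -m service -a "name=nginx state=started enabled=yes"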

How do you recover the deleted object in S3?

Deleted objects in S3 can be recovered if versioning is enabled on the bucket. You can restore a deleted object by using the AWS Management Console, AWS CLI, or SDK to initiate a restore operation on the object's version.
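
With versioning enabled, deleting an object only adds a delete marker; removing that marker restores the object. A hedged CLI sketch (the bucket and key are placeholders):

aws s3api list-object-versions --bucket my-bucket --prefix reports/data.csv          # find the delete marker's VersionId
aws s3api delete-object --bucket my-bucket --key reports/data.csv --version-id <delete-marker-version-id>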

How do you route the data only to one EC2 instance when an application load balancer has 5 servers connected?

You can route data to only one EC2 instance behind an Application Load Balancer (ALB) by configuring session stickiness or by using a target group with a single target (the desired instance) and associating it with the ALB listener.

What is called “FROM SCRATCH” in Docker?

FROM scratch in a Dockerfile uses Docker's reserved, empty base image: the build starts from an empty filesystem with no pre-existing layers. It is typically used to build minimal images that contain only a statically compiled binary.

Can we run the container inside the container?

Yes, it's possible to run a Docker container within another Docker container, though it's generally not recommended due to security and performance concerns. This technique is often used in specific scenarios like testing or debugging.

Can we use Ansible to create infrastructure in AWS?

Yes, Ansible can be used to create and manage infrastructure in AWS. Ansible provides AWS modules that allow you to provision resources such as EC2 instances, VPCs, security groups, and more.

What is called EC2 auto recovery?

EC2 Auto Recovery is a feature provided by AWS that automatically recovers an EC2 instance if it becomes impaired due to an underlying hardware failure. It preserves the instance's instance ID, private IP addresses, Elastic IP addresses, and all instance metadata.

What is called Persistent Storage in Docker?

Persistent storage in Docker refers to storage that persists even after a container is stopped or deleted. This can be achieved by using Docker volumes or bind mounts to mount directories from the host filesystem into the container.

What happens when you delete /var/lib/docker/overlay?

Deleting the /var/lib/docker/overlay directory would likely corrupt Docker's internal state, as it contains important files related to Docker's overlay filesystem, used for managing container filesystems and layers. It's not recommended to manually delete this directory.

What are called regular expressions in Linux?

Regular expressions (regex) in Linux are patterns used for matching strings or text data. They are powerful tools for searching, manipulating, and validating text based on specific patterns or rules.

What is called DynamoDB?

DynamoDB is a fully managed NoSQL database service provided by AWS. It offers seamless scalability, high availability, and low latency for applications requiring fast and predictable performance at any scale.

How do you push the image to DockerHub?

To push a Docker image to DockerHub, you first log in with docker login, tag the image using your DockerHub username and repository name, and then use the docker push command followed by the tagged image name. For example:

docker tag my_image my_username/my_repository
docker push my_username/my_repository

Why do you change the name of the image using the tag command in Docker?

You change the name of the image using the tag command in Docker to associate the image with a specific repository on DockerHub or another Docker registry. This allows you to push the image to the registry and pull it from other hosts.

How do you authorize data to the Application Load Balancer?

Data is authorized to an Application Load Balancer (ALB) by configuring security groups and network ACLs to allow traffic from specific sources to reach the ALB's listener ports. Additionally, you can configure AWS IAM policies to control access to the ALB's APIs and resources.

What is called Event Handler in Lambda?

The handler in AWS Lambda is the method in your function code that processes events. When the function is invoked, Lambda runs the handler, passing it the event that triggered the invocation (such as an HTTP request, S3 upload, or DynamoDB update) along with a context object.

What is the difference between CMD and Entrypoint in Docker?

CMD is an instruction in a Dockerfile that specifies the default command (or default arguments) to run when a container starts; it is easily overridden by arguments passed to docker run. ENTRYPOINT defines the executable that always runs when the container starts, and any arguments supplied at runtime (or via CMD) are appended to it as parameters.

What is called CloudFormation?

AWS CloudFormation is a service that allows you to define and provision infrastructure as code, enabling you to create, update, and manage AWS resources in a declarative and automated way using JSON or YAML templates. You can spend less time managing resources and more time focusing on your applications.

How do you change the name of an instance in a Terraform file without destroying it?

If you mean the instance's Name tag, simply update the tags argument of the aws_instance resource and run terraform apply; this is an in-place update. If you mean the Terraform resource label itself, renaming it would normally trigger a destroy-and-recreate, so first move the existing object in state with terraform state mv (or declare a moved block in recent Terraform versions) and then apply.

How does Ansible execute the jobs?

Ansible executes jobs by connecting to remote hosts via SSH (by default) and running tasks defined in playbooks or ad-hoc commands. It utilizes modules on the remote hosts to perform tasks, collects results, and reports back to the control node.

How to connect the on-premise data center to AWS?

You can connect an on-premise data center to AWS using various methods such as VPN (Virtual Private Network), Direct Connect, or AWS VPN CloudHub. These methods establish a secure and private connection between your on-premise network and your AWS VPC.

What is a GIT tag?

A Git tag is a reference to a specific commit in a Git repository. Tags are typically used to mark releases or significant points in the project history, allowing you to easily reference and checkout specific versions of the codebase.
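
For example (the version number is illustrative):

git tag -a v1.2.0 -m "Release 1.2.0"   # create an annotated tag on the current commit
git push origin v1.2.0                 # publish the tag to the remote
git checkout v1.2.0                    # later, check out the tagged version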

What is DevOps?

DevOps is a software development methodology that improves collaboration between development and operations teams using various automation tools. These tools are applied across the stages of the DevOps lifecycle. More broadly, DevOps is a culture that improves an organization's ability to deliver applications and services at high velocity.

Why do we need DevOps?

DevOps helps organizations deliver software more quickly, reliably, and efficiently by breaking down silos between development and operations teams, automating manual processes, improving collaboration, and promoting a culture of continuous improvement. It enables faster time to market, higher-quality software, and increased responsiveness to customer feedback.

How do you configure the job in Jenkins?

To configure a job in Jenkins, you typically:

Click on "New Item" on the Jenkins dashboard.

Enter a name for the job and select the type of job (e.g., Freestyle project, Pipeline).

Configure the job settings such as source code management, build triggers, build steps, post-build actions, etc.

Save the configuration.

What are the roles you played on your laptop?

This question seems a bit broad. Typically, roles played on a laptop could include software development, system administration, content creation, communication, and entertainment.

How do you configure Ansible in Jenkins?

To configure Ansible in Jenkins, you would typically:

  • Install the Ansible plugin in Jenkins.

  • Configure Jenkins to have access to your Ansible playbooks and inventory files.

  • Create a new Jenkins job and configure it to execute an Ansible playbook as a build step.

  • Specify the playbook path and any required options.

  • Save the job configuration.

Difference between Ant and Maven?

Ant and Maven are both build automation tools, but they differ in several ways:

Ant is procedural, allowing developers to define build tasks explicitly in XML. Maven, on the other hand, follows a convention-over-configuration approach, providing a standard project structure and predefined build lifecycle.

Maven manages dependencies automatically using a central repository, while Ant requires developers to manage dependencies manually.

Maven supports project inheritance and standardized project structures, making it easier to manage complex projects compared to Ant.

Git workflow?

Git workflow refers to a set of guidelines and practices for using Git to manage code changes collaboratively. Common Git workflows include Centralized Workflow, Feature Branch Workflow, Gitflow Workflow, and Forking Workflow. Each workflow defines rules for branching, merging, and releasing code changes.

Maven lifecycle?

Maven defines a standard build lifecycle consisting of phases like validate, compile, test, package, install, and deploy, among others. Each phase represents a different stage in the software development lifecycle, and Maven plugins are bound to these phases to perform specific tasks.

Where do you find errors in Jenkins?

  • Errors in Jenkins can be found in various places:

    • Console output of build jobs: Jenkins logs the output of build jobs, including any errors encountered during the build process.

    • Build history: Jenkins provides a build history for each job, allowing you to view the status and output of previous builds, including any failed builds.

    • Jenkins system logs: Jenkins logs system-level messages, including errors, to log files located in the Jenkins home directory.

How do you integrate SonarQube in Jenkins?

To integrate SonarQube in Jenkins, you would typically:

  • Install the SonarQube Scanner plugin in Jenkins.

  • Configure Jenkins to have access to your SonarQube server.

  • Add a SonarQube scanner build step to your Jenkins job configuration, specifying the required options such as the SonarQube server URL, authentication credentials, and project key.

  • Run the Jenkins job, and the SonarQube scanner will analyze the code and send the results to the SonarQube server for review.

How do you configure the bucket?

  • Configuring an S3 bucket in AWS involves several steps:

    • Log in to the AWS Management Console and navigate to the S3 service.

    • Click on "Create bucket" and follow the prompts to configure the bucket name, region, and other settings.

    • Configure permissions for the bucket using bucket policies, Access Control Lists (ACLs), or IAM policies.

    • Optionally, configure features such as versioning, server access logging, encryption, and lifecycle policies according to your requirements.

    • Once configured, you can upload objects to the bucket and manage its settings through the S3 console or AWS CLI.

Differences between git rebase and git merge?

    • git merge integrates changes from one branch into another, creating a merge commit that ties the two histories together. It preserves the complete history of both branches, but the extra merge commits can clutter the log.

    • git rebase replays the commits of one branch on top of the target branch's commits, producing a cleaner, linear history without merge commits. However, it rewrites history and should be used with caution, especially on shared branches.

What is git init?

git init is a command used to initialize a new Git repository in the current directory or in a specified directory. It creates a new .git directory that stores Git metadata, including the repository's configuration, object database, and references to branches and commits.

What is a git clone?

git clone is a command used to create a copy of an existing Git repository, including all of its files, branches, and commit history. It fetches the contents of the repository from a remote location (usually another Git repository) and sets up a local repository on the user's machine.

If a file is suddenly deleted in git, how do you get it back?

If a file is accidentally deleted in a Git repository, you can restore it from a previous commit using the git checkout command:

git checkout <commit_hash> -- path/to/deleted/file

Replace <commit_hash> with the hash of a commit where the file still exists, and path/to/deleted/file with the path to the deleted file. This will restore the file to its state at that commit. Alternatively, if the file was deleted in the most recent commit, you can use:

git checkout HEAD -- path/to/deleted/file

This will restore the file from the most recent commit in the current branch.

What is the purpose of Docker?

  • Docker is a platform for developing, shipping, and running applications using containerization technology. Its main purposes include:

    • Providing a consistent environment for developers, enabling them to package their applications and dependencies into portable containers.

    • Facilitating the deployment of applications across different environments, from development to production, with minimal differences and dependencies.

    • Improving resource utilization by running lightweight, isolated containers on a shared host operating system.

    • Enabling scalability and agility in software development and deployment processes.

In Jenkins how can you find log files?

    • In Jenkins, you can find log files for build jobs and the Jenkins server itself:

      • For build jobs: Navigate to the specific job's page on the Jenkins dashboard, click on a specific build number, and then click on "Console Output" to view the build log.

      • For Jenkins server logs: Depending on how Jenkins is installed, logs can typically be found in the Jenkins home directory, often under a directory named logs or similar. Common log files include jenkins.log for general server logs and access_log for access logs.

How do you deploy using Ansible from Jenkins?

    • Ansible can be used to automate deployment tasks in Jenkins by integrating Ansible playbooks or roles into Jenkins jobs:

      • Install Ansible on the Jenkins server or on a machine accessible to Jenkins.

      • Write Ansible playbooks or roles to define the deployment tasks, such as copying files to remote servers, restarting services, etc.

      • Create a Jenkins job and configure it to execute the Ansible playbook or role as a build step.

      • Specify any required parameters or options in the Jenkins job configuration.

      • Run the Jenkins job to trigger the deployment process using Ansible.

What is the use of Ansible?

    • Ansible is an open-source automation tool used for configuration management, application deployment, and orchestration. Its main uses include:

      • Automating repetitive tasks such as software installation, configuration, and updates across multiple servers.

      • Enforcing consistent configurations and policies across IT infrastructure, improving reliability and security.

      • Streamlining application deployment processes by automating tasks such as code deployment, server provisioning, and environment setup.

      • Orchestration of complex workflows and interactions between different systems and services.

What is configuration management?

    • Configuration management is the process of systematically managing changes to an organization's IT infrastructure, ensuring that systems and software configurations are consistent, compliant, and up-to-date. Its main objectives include:

      • Maintaining consistency and integrity across IT environments by documenting and managing configuration items (CIs) such as servers, software, and network devices.

      • Enabling efficient management of changes, updates, and deployments through automation and standardized processes.

      • Improving visibility and control over IT assets and configurations, facilitating troubleshooting, auditing, and compliance efforts.

      • Supporting scalability and agility in IT operations by enabling rapid provisioning, configuration, and decommissioning of resources.

In the Ubuntu server, what is a public key and private key?

    • In the Ubuntu server, a public key and a private key are components of asymmetric cryptography used for secure communication and authentication.

      • A public key is shared freely and used to encrypt data or verify signatures. It can be shared with anyone.

      • A private key is kept secret and used to decrypt data or create signatures. It must be kept secure and should not be shared with anyone.

Roles and Responsibilities of a DevOps engineer:

Collaborating with development, operations, and other stakeholders to streamline the software delivery process.

Implementing and managing continuous integration and continuous deployment (CI/CD) pipelines.

Automating infrastructure provisioning, configuration, and deployment using tools like Ansible, Terraform, or Kubernetes.

Monitoring and maintaining production systems to ensure availability, performance, and reliability.

Implementing and managing tools for log management, monitoring, and alerting.

Implementing and enforcing security best practices across development and production environments.

Troubleshooting issues across the software development lifecycle and ensuring rapid resolution.

Difference between SVN and GIT:

    • Centralized vs. Distributed: SVN (Subversion) is a centralized version control system, meaning it has a single repository that serves as the central source of truth. Git, on the other hand, is distributed, meaning every developer has a complete copy of the repository, including its full history.

      • Branching and Merging: SVN uses a branching model where branches are heavy and long-lived. Git encourages lightweight branching and merging due to its distributed nature, making branching and merging faster and easier.

      • Performance: Git is generally faster than SVN, especially for operations like branching, merging, and committing, due to its distributed architecture.

      • Workflow: SVN supports a lock-modify-unlock model (commonly used for binary files), while both SVN and Git default to a copy-modify-merge model. Git's distributed design makes concurrent development and merging more flexible and reduces the likelihood of conflicts.

      • Repository Size: Git repositories tend to be smaller in size compared to SVN repositories due to Git's efficient storage mechanism.

What version control tools are used in the present market?

    • In addition to Git and SVN, other popular version control tools in the market include:

      • Mercurial

      • Perforce

      • Microsoft Team Foundation Version Control (TFVC)

      • CVS (Concurrent Versions System)

Git commit:

* git commit is a command used to save changes to the local repository. It creates a new commit containing the changes staged for commit along with a commit message describing the changes.

Git push and fetch:

* git push is used to upload local repository content to a remote repository, typically hosted on a platform like GitHub or Bitbucket.

* git fetch is used to retrieve changes from a remote repository and store them in the local repository without modifying the working directory. It updates the remote tracking branches.

How to create a repository in GitHub:

* To create a repository on GitHub:

* Log in to your GitHub account.

* Click on the "+" icon in the top-right corner and select "New repository."

* Enter a name for your repository, choose visibility options, and configure other settings as needed.

* Click on "Create repository" to finalize the creation process.

How to push a file in the GitHub flow:

* After creating a repository on GitHub, you can push files to it using the following steps:

* Initialize a local Git repository in the directory containing the files: git init.

* Add the files to the staging area: git add . (to add all files) or git add <file> (to add specific files).

* Commit the changes: git commit -m "Initial commit".

* Link the local repository to the remote GitHub repository: git remote add origin <repository_url>.

* Push the changes to GitHub: git push -u origin master.

Have you worked on Maven scripts?

Yes, I have experience working with Maven and writing Maven scripts for building, testing, packaging, and deploying Java applications. Maven is a popular build automation tool used for managing dependencies, building projects, and generating project documentation.

About branching strategies:

* Branching strategies define how code changes are managed and integrated in a version control system. Common branching strategies include:

* Gitflow: A branching model that uses long-lived branches for feature development, releases, and hotfixes.

* GitHub Flow: A simpler branching model in which the main branch (usually master or main) is always deployable; new work happens on short-lived feature branches that are merged back into main through pull requests.

* Trunk-Based Development: A strategy where all changes are committed directly to the main branch, with short-lived feature branches used sparingly.

Maven lifecycle:

* Maven defines a standard build lifecycle consisting of phases such as validate, compile, test, package, verify, install, and deploy. Each phase represents a different stage in the software development lifecycle, and Maven plugins are bound to these phases to perform specific tasks.

About pom.xml:

* pom.xml is the Project Object Model file used by Maven to configure and manage a project. It contains project information such as dependencies, build settings, plugins, and profiles. The pom.xml file defines the structure and behavior of the Maven project and is located in the project's root directory.

Location and configuration file in Ansible:

Ansible's main configuration file is typically located at /etc/ansible/ansible.cfg on the control node. It can also live at $HOME/.ansible.cfg for per-user configuration or as ansible.cfg in the current working directory, and a custom file can be selected by setting the ANSIBLE_CONFIG environment variable.

What are the modules you have used in Ansible?

* Ansible provides a wide range of modules for automating tasks across various systems and services. Some commonly used modules include:

* shell: Execute shell commands on remote hosts.

* copy: Copy files to remote hosts.

* template: Render Jinja2 templates on remote hosts.

* apt/yum/pacman: Manage packages on Debian/Ubuntu, RHEL/CentOS, and Arch Linux systems, respectively.

* service/systemd: Manage system services.

* docker_image/docker_container: Manage Docker images and containers.

* lineinfile: Modify lines in text files on remote hosts.

* git: Clone Git repositories on remote hosts.

* uri: Interact with REST APIs.

Where did you find an error in Jenkins?

* Errors in Jenkins can be found in various places:

* Console Output: The console output of a Jenkins job contains detailed information about the build process, including any errors encountered.

* Build History: The build history on the Jenkins dashboard shows the status of past builds, including failed builds.

* Jenkins System Logs: System logs for Jenkins are typically located in the Jenkins home directory, often under a directory named logs. Common log files include jenkins.log for general server logs and access_log for access logs.

* Notifications: If configured, Jenkins can send email notifications or integrate with messaging platforms like Slack to notify users of build failures.

What is the Jira tool?

* Jira is a project management tool used for issue tracking, task management, and project management. It allows teams to plan, track, and manage agile software development projects, as well as other types of projects such as bug tracking, task assignment, and workflow management.

As a DevOps engineer, why do we use Jira Tool?

DevOps teams use Jira to facilitate collaboration, track issues and tasks, manage project workflows, and streamline software development processes. It helps in organizing and prioritizing work, tracking progress, and ensuring that development, operations, and other teams are aligned towards common goals.

Why do we use a pipeline in Jenkins? What is the flow?

* Pipelines in Jenkins provide a way to define and automate the software delivery process, from code commit to deployment. They allow for the creation of continuous integration and continuous delivery (CI/CD) workflows, enabling automation of build, test, and deployment tasks. Pipelines ensure consistency, repeatability, and traceability in the software delivery process, leading to faster time-to-market and higher-quality software.

What is release management for production?

* Release management is the process of planning, scheduling, coordinating, and controlling the deployment of software releases from development to production environments. It involves activities such as identifying release requirements, coordinating with stakeholders, conducting testing, managing risks, and ensuring smooth deployment and transition to production.

chmod 444 <filename.txt> as the root user. Change the above permissions to 777.

* To change the permissions of a file named filename.txt to 777:

# chmod 777 filename.txt

curl www.google.com is not working, but telnet www.google.com is working. Why?

* curl is a command-line tool for transferring data with URLs. If curl www.google.com is not working, it could be due to DNS resolution problems, proxy settings, or firewall rules blocking HTTP/HTTPS traffic. telnet www.google.com 80, however, only opens a raw TCP connection to the specified host and port, so it can succeed even when the higher-level HTTP request made by curl is blocked or misconfigured.

I have two instances, one in a public subnet and one in a private subnet. I can ping from one server to the other and get a response, but telnet <ip> on port 23 is not working. Why?

If you are able to ping from one server to another but telnet on port 23 (the Telnet service) is not working, there is likely a firewall or security group rule blocking incoming connections on port 23. Check the security group settings for the instance and ensure that TCP port 23 is allowed for incoming connections; ICMP (ping) and TCP ports are controlled by separate rules, so a successful ping does not mean the port is open.

What is SSL? And how does it work internally?

SSL (Secure Sockets Layer) is a cryptographic protocol used to secure communication over the internet. It provides encryption, authentication, and data integrity for sensitive information transmitted between a client and a server. SSL works by establishing a secure connection between the client and server through a process called the SSL handshake, which involves key exchange, encryption negotiation, and authentication.

My web servers are running in private subnets. I want to route my ELB Traffic to web servers in private subnets.

To route ELB (Elastic Load Balancer) traffic to web servers in private subnets, you can set up a target group for your web servers and configure the ELB to forward traffic to the target group. The web servers should be registered with the target group, and the ELB should be configured to route incoming traffic based on the target group's rules.

What is NAT Instance/NAT Gateway?

* NAT (Network Address Translation) Instance and NAT Gateway are both used to enable outbound internet connectivity for instances in private subnets. NAT Instance is an EC2 instance configured to perform NAT, while NAT Gateway is a managed service provided by AWS. Both translate private IP addresses of instances to public IP addresses, allowing them to access the internet while remaining hidden behind a public IP.

Why do we use Security Groups (SGs)?

Security Groups are fundamental to network security in AWS. A security group acts as a virtual, stateful firewall for your instances, controlling inbound and outbound traffic based on protocols, ports, and IP addresses, and a single security group can be attached to many EC2 instances. SGs enforce network access policies and thereby enhance security and compliance.

If we have to install Ubuntu, where do we define the OS while launching the EC2 instance?

* When launching an EC2 instance, the OS can be specified by selecting the appropriate AMI (Amazon Machine Image) that contains the Ubuntu version desired.

What is .pem?

* .pem (Privacy-Enhanced Mail) is a Base64-encoded container format for cryptographic keys and certificates, commonly used on Linux and related systems. In AWS, the .pem file downloaded when you create a key pair contains the private key used to SSH into EC2 instances securely.

If we stop the EC2 instance, will the Private IP change?

No, stopping an EC2 instance retains its private IP address; only the public IP is released (unless an Elastic IP is associated). Terminating an instance releases its private IP, and a replacement instance will receive a different one.

What is blue/green development?

Blue/green deployment involves maintaining two identical production environments, where one (say blue) serves live traffic while the other (green) receives new code deployments and rigorous testing. Once validated, traffic is switched to the green environment, ensuring minimal downtime and enabling quick rollback if needed.

What is PaaS?

Platform as a Service (PaaS) is a cloud computing service model where a provider delivers a platform to customers, typically including operating system, programming language execution environment, database, and web server. PaaS facilitates application development, deployment, and management without the complexity of building and maintaining the underlying infrastructure.

What is shell scripting? How do we use the script for Automation?

Shell scripting involves writing scripts (sequences of commands) for the command-line shell of an operating system (like Bash in Unix/Linux) to automate tasks. For automation:

* Identify repetitive tasks suitable for automation.

* Write shell scripts using commands and logic (loops, conditionals).

* Test scripts thoroughly in a controlled environment before deploying in production.

* Use cron jobs or schedulers to run scripts automatically at specified intervals (a small example script and cron entry are shown below).
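
A minimal sketch under hypothetical paths: the script archives an application's log directory, and the crontab entry below it runs the script every night at 02:00.

#!/bin/bash
# backup_logs.sh: archive the application's log directory with a date-stamped name
set -euo pipefail
SRC=/var/log/myapp
DEST=/backup/myapp-$(date +%F).tar.gz
tar -czf "$DEST" "$SRC"
echo "Backup written to $DEST"

Corresponding crontab entry:

0 2 * * * /usr/local/bin/backup_logs.sh >> /var/log/backup_logs.cron.log 2>&1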

What is MySQL? How many ways can we use to take backup?

MySQL is an open-source relational database management system (RDBMS) that uses SQL (Structured Query Language) to manage databases. There are several ways to take backups in MySQL (a mysqldump example follows the list):

* Logical Backup: Using tools like mysqldump to export SQL statements that can recreate the database structure and data.

* Physical Backup: Copying the MySQL data directory directly, which includes all database files.

* Replication: Using MySQL replication to create backups on a slave server, keeping it synchronized with the master server.

* Backup Tools: Utilizing third-party backup tools and services that integrate with MySQL for automated backups and recovery.
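
For instance, a logical backup and restore with mysqldump (the database name and file names are placeholders):

mysqldump -u root -p --single-transaction mydb > mydb_$(date +%F).sql   # dump schema and data to a SQL file
mysql -u root -p mydb < mydb_backup.sql                                 # restore from a previous dump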

How do you execute a shell script within a Python script?

To execute a shell script within a Python script, you can use the subprocess module, which allows you to spawn new processes, connect to their input/output/error pipes, and obtain their return codes.
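
A minimal sketch using subprocess.run, shown here launched from the shell via a heredoc (the backup.sh path is hypothetical):

python3 - <<'EOF'
import subprocess
# run the shell script; check=True raises an exception if it exits non-zero
result = subprocess.run(["bash", "./backup.sh"], capture_output=True, text=True, check=True)
print(result.stdout)
EOF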

How do you execute jobs in AWS?

AWS Lambda

  • Purpose: Run serverless code in response to events.

  • Key Steps:

    1. Create a Lambda function.

    2. Deploy via AWS Console, CLI, or Infrastructure as Code.

    3. Invoke manually or set up event triggers (API Gateway, S3, CloudWatch).

AWS Batch

  • Purpose: Run large-scale batch computing jobs.

  • Key Steps:

    1. Create a Compute Environment.

    2. Create a Job Queue.

    3. Define a Job Definition.

    4. Submit jobs via AWS Console or SDK.

AWS Step Functions

  • Purpose: Orchestrate complex workflows by coordinating multiple AWS services.

  • Key Steps:

    1. Define a State Machine.

    2. Deploy the State Machine.

    3. Start execution via AWS Console or SDK
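
Hedged CLI examples for each of the services above (function, queue, job definition, and state machine names are placeholders):

aws lambda invoke --function-name my-function response.json
aws batch submit-job --job-name nightly-report --job-queue my-queue --job-definition my-job-definition
aws stepfunctions start-execution --state-machine-arn <state-machine-arn> --input '{"key": "value"}'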

What steps do you take when a build fails in Jenkins?

  • Review the Build Logs: Access and analyze the console output for error messages and stack traces.

  • Identify the Cause: Look for common issues such as code errors, failed tests, or configuration problems.

  • Check Recent Changes: Review recent commits or merges and communicate with team members.

  • Reproduce the Issue Locally: Try to replicate the build failure on your local machine.

  • Verify Jenkins Configuration: Check build scripts and environment variables for errors.

  • Rerun the Build: Retry the build and consider cleaning the workspace.

  • Roll Back Changes: Revert recent changes if they are identified as the cause.

  • Implement a Fix: Make necessary code or configuration updates to resolve the issue.

  • Monitor the Fix: Ensure the issue is resolved and monitor subsequent builds.

  • Document the Issue: Record the cause and resolution for future reference.

How do you integrate LDAP with AWS and Jenkins?

Integrating LDAP with AWS

  1. Set Up AWS Directory Service: Create a directory using AWS Managed Microsoft AD or Simple AD.

  2. Configure IAM Roles and Policies: Create roles and policies for LDAP users.

  3. Enable AWS IAM Identity Center (if applicable): Integrate your LDAP directory.

  4. Set Up SSO with LDAP: Configure SSO settings for LDAP authentication.

Integrating LDAP with Jenkins

  1. Install LDAP Plugin: Install via "Manage Plugins".

  2. Configure LDAP Plugin: Set LDAP server details and user search parameters in "Configure Global Security".

  3. Test LDAP Connection: Verify settings using "Test LDAP Settings".

  4. Configure Authorization: Set up access control using LDAP groups.

  5. Verify User Authentication: Log in with LDAP credentials to confirm integration.
