PAF Tute 3
Aspects of code quality
Maintaining code quality matters in many ways, such as:
The long-term usefulness and long-term maintainability of the code
Minimizing errors and making the code easier to debug
Improving understandability
Decreasing risk
When we talk about code quality, there are several aspects that deserve attention.
The aspects of code quality are -
Weighted Micro Function Points (WMFP)
Halstead complexity measures
Cyclomatic complexity
Lines of code
Lines of code per method
Weighted Micro Function Points, also known as WMFP, is a modern software sizing algorithm developed by Logical Solutions in 2009. It is a successor to solid scientific ancestor methods such as COCOMO, COSYSMO, the maintainability index, cyclomatic complexity, function points and Halstead complexity. WMFP produces more accurate results than traditional methods for sizing software, requires less configuration, and demands less knowledge from the end user, because most of the estimate is based on automatic measurements of existing source code.
The Halstead complexity measures are another aspect of code quality that must be discussed here. These software metrics were introduced by Maurice Howard Halstead in 1977 as part of his treatise on establishing an empirical science of software development.
Note that Halstead's view was that software metrics should reflect the implementation or expression of an algorithm in different languages, independently of how it is executed on a platform; the metrics are therefore computed statically from the code. Halstead's goal was to identify measurable properties of software and the relationships between them.
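As a rough, hedged illustration of these counts (a minimal sketch; the statement and variable names are ours, and exact counting conventions differ slightly between tools), the snippet below computes the basic Halstead measures for one assignment statement.

```java
// Computes the basic Halstead measures for the statement:  result = base + rate * hours;
// assuming one common counting convention: operators {=, +, *}, operands {result, base, rate, hours}.
public class HalsteadExample {
    public static void main(String[] args) {
        int n1 = 3;  // distinct operators
        int n2 = 4;  // distinct operands
        int N1 = 3;  // total operator occurrences
        int N2 = 4;  // total operand occurrences

        int vocabulary = n1 + n2;                                       // n = 7
        int length = N1 + N2;                                           // N = 7
        double volume = length * (Math.log(vocabulary) / Math.log(2));  // V = N * log2(n) ≈ 19.65
        double difficulty = (n1 / 2.0) * ((double) N2 / n2);            // D = (n1/2) * (N2/n2) = 1.5
        double effort = difficulty * volume;                            // E = D * V ≈ 29.5

        System.out.printf("Volume = %.2f, Difficulty = %.2f, Effort = %.2f%n",
                volume, difficulty, effort);
    }
}
```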
Cyclomatic complexity is another aspect of code quality. It is a source code complexity measurement that correlates with the number of coding errors.
It is calculated by building a control-flow graph of the code and counting the number of linearly independent paths through a program module. The lower the cyclomatic complexity of a program, the lower the risk of changing it and the easier it is to understand. Cyclomatic complexity can be expressed with a formula.
The formula is -
M = E - N + 2P
Let's identify the terms in the above formula.
E - the number of edges in the control-flow graph
N - the number of nodes in the graph
P - the number of connected components (for a single program or method, P = 1)
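As a small, hedged illustration (the class and method below are ours, not taken from any particular tool), the method has two decision points, the loop and the if, so its cyclomatic complexity is 2 + 1 = 3, which is the same value the E - N + 2P calculation yields for its control-flow graph.

```java
// Counting decision points gives the same result as E - N + 2P for structured code.
// This method has two decision points (the for loop and the if), so its complexity is 3.
public class ComplexityExample {
    public static int countPositives(int[] values) {
        int count = 0;
        for (int value : values) {   // decision point 1
            if (value > 0) {         // decision point 2
                count++;
            }
        }
        return count;
    }
}
```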
When we talk about code, quality is an important point that constantly needs attention. The following characteristics must be kept in mind while coding, because they are directly anchored in the quality of the code.
They are -
Efficiency
Reliability
Robustness
Portability
Maintainability
Readability
Let's look at each of these characteristics.
Efficiency -
It is very important to ensure that the code is efficient. Efficiency covers several things, such as speed and resource usage.
These are the ways we ensure that the code is really efficient -
Delete unnecessary or redundant code
Write reusable code
Limit resource use
Use suitable data types, functions and loops in suitable locations (a short sketch follows this list)
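As a minimal, illustrative sketch of the last point (the class and method names are ours): building a string with a StringBuilder avoids copying the whole string on every loop iteration, which is what repeated String concatenation would do.

```java
import java.util.List;

// Choose a suitable data type (StringBuilder) for the job instead of
// concatenating Strings inside a loop.
public class EfficiencyExample {
    public static String joinWithCommas(List<String> words) {
        StringBuilder joined = new StringBuilder();
        for (int i = 0; i < words.size(); i++) {
            if (i > 0) {
                joined.append(", ");
            }
            joined.append(words.get(i));
        }
        return joined.toString();
    }
}
```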
Reliability -
Reliability is the ability to perform consistent, error-free operations every time the code runs. Software becomes far less useful if its functions behave differently each time they are executed with the same input in the same environment, or if every unhandled mistake throws the result off.
To keep reliability in your code, these steps must be followed -
Take enough time to assess and test the code
Test the code carefully and thoroughly in all possible ways
Use correct exception and error handling (see the sketch after this list)
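A minimal sketch of the last point (the class, method and message texts are ours): the method behaves the same way for the same input every time, and bad input is reported through a well-defined exception instead of failing unpredictably.

```java
// Correct exception and error handling: invalid input always produces
// the same, clearly described exception.
public class PortParser {
    public static int parsePort(String text) {
        if (text == null) {
            throw new IllegalArgumentException("Port must not be null");
        }
        try {
            int port = Integer.parseInt(text.trim());
            if (port < 1 || port > 65535) {
                throw new IllegalArgumentException("Port must be between 1 and 65535, got: " + port);
            }
            return port;
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException("Port must be a whole number, got: '" + text + "'", e);
        }
    }
}
```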
Robustness -
Robustness means the ability to cope with errors during the execution of the program, even under unusual circumstances.
To maintain robustness in your code, these steps should be followed -
Test the software under all circumstances, both usual and unusual
Use correct exception and error handling
Provide clear and understandable error messages so that users can debug the program more easily (a short sketch follows this list)
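A minimal sketch of the last two points (the class name, file argument and message wording are illustrative): an unusual situation such as a missing configuration file is handled explicitly and reported with a clear, actionable message. Note that Files.readString requires Java 11 or later.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Handle an unusual circumstance (missing or unreadable file) and tell
// the user exactly what to check.
public class ConfigLoader {
    public static String readConfig(Path configFile) {
        try {
            return Files.readString(configFile);
        } catch (IOException e) {
            throw new IllegalStateException(
                "Could not read configuration file '" + configFile
                + "'. Check that the file exists and that you have permission to read it.", e);
        }
    }
}
```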
Portability -
Portability is the ability of the code to run on as many different machines and operating systems as possible.
To keep portability in your code, these steps must be followed -
Plan for portability from the very beginning
Write code that can work in any possible environment
Maintainability -
Maintainability means that new functions can easily be added to the code, existing functions can easily be changed, and bugs can easily be fixed. There are important facts that make code maintainable.
They are - the written code must be
Easy to understand
Easy to find what needs to be changed
Easy to change
Easy to check whether the changes have introduced any bugs
To maintain the maintainability of your code, these steps must be followed -
Use good names for variables, methods and classes
Use correct indentation and a consistent style
Write good technical documentation
Write correct comments or summary descriptions at the top of files, classes and functions
Readability -
Readability is the ability of the code to be followed easily, quickly and clearly, even by someone who has not seen it for a while. It ensures that everyone can understand the code written by anyone else.
To maintain good readability in your code, follow these steps (a short sketch follows the list) -
Use correct names for variables, methods and classes
Use consistent indentation and markup styles
Write correct comments and a summary at the top of files, classes and functions
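A short, illustrative sketch of these points (the class, method and constant names are our own): descriptive names, consistent indentation and a summary comment at the top make the intent obvious to someone reading the code for the first time.

```java
/**
 * Calculates the gross pay for one pay period,
 * including an overtime premium for hours above the regular schedule.
 */
public class PayCalculator {

    private static final double OVERTIME_MULTIPLIER = 1.5;

    public double grossPay(double hourlyRate, double regularHours, double overtimeHours) {
        double regularPay = hourlyRate * regularHours;
        double overtimePay = hourlyRate * OVERTIME_MULTIPLIER * overtimeHours;
        return regularPay + overtimePay;
    }
}
```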
Tools for maintaining code quality
With the development of information technology, tools have been introduced to preserve code quality.
We can name some examples of those tools. They are -
Checkstyle
PMD
FindBugs
SonarQube
Let's talk about each tool to get a good understanding of them.
Checkstyle -
Checkstyle is a free and open-source static code analysis tool used in software development to check whether Java code complies with the coding conventions you have established. It automates the crucial but tedious task of checking Java code and is a popular tool for automating the code review process. Checkstyle comes with predefined rules that help maintain coding standards, although those rules do not take project-specific requirements into account. Checkstyle can be used as an Eclipse plugin or as part of a build system such as Ant, Maven or Gradle.
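As a hedged illustration of the kind of convention Checkstyle enforces (the class below is ours, and the exact behaviour depends on the rule set you configure), a naming check such as ConstantName flags constants that are not written in upper snake case.

```java
// Illustrative only: with a typical Checkstyle configuration, a constant-name
// check flags the first field and accepts the second.
public class Timeouts {
    static final int connectTimeoutMs = 5_000;   // likely flagged: constant not in UPPER_SNAKE_CASE
    static final int READ_TIMEOUT_MS = 10_000;   // conforms to the usual convention
}
```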
PMD -
PMD is a static code analysis utility that can automatically detect a wide range of potential bugs and unsafe or non-optimized code. PMD scans Java source code and looks for potential problems such as possible bugs, dead code, suboptimal code, overcomplicated expressions and duplicated code. PMD focuses on preventive defect detection and has a rich, highly configurable set of rules. PMD can be used from Eclipse, IntelliJ, Maven, Gradle and Jenkins.
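A hedged sketch of the kinds of problems PMD reports (the class is ours, and the exact findings depend on the rules you enable): dead code and a catch block that silently swallows an error are typical examples.

```java
// Illustrative only: typical PMD rule sets report the unused variable (dead code)
// and the empty catch block that hides the problem from the caller.
public class LengthUtil {
    public static int totalLength(String first, String second) {
        int unused = 42;  // dead code: assigned but never read
        try {
            return first.length() + second.length();
        } catch (NullPointerException e) {
            // empty catch block: the error is silently swallowed
        }
        return 0;
    }
}
```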
FindBugs -
FindBugs is an open-source static analysis tool for Java code. It has a somewhat different focus: it aims at detecting possible errors and performance problems, and it finds a large number of common, hard-to-spot kinds of coding errors. The types of errors that FindBugs looks for include the following (a small sketch follows this list) -
Thread synchronization problems
Null pointer dereferences
Infinite recursive loops
Misuse of API methods
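A hedged sketch of two of these error types (the class is our own example, not FindBugs sample code): a possible null pointer dereference and an infinite recursive loop.

```java
// Illustrative only: two bugs of the kind FindBugs reports.
public class Account {
    private String ownerName;  // may legitimately be null

    public int ownerNameLength() {
        if (ownerName == null) {
            System.out.println("No owner set");
        }
        return ownerName.length();  // possible null pointer dereference on the branch above
    }

    public int balance() {
        return balance();           // infinite recursive loop: the method always calls itself
    }
}
```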
FindBugs has been used to identify hundreds of serious defects in large applications, and the defects it reports are ranked in four categories.
Those four categories are -
Scariest
Scary
Troubling
Of concern
SonarQube -
SonarQube is an open-source platform that was originally launched in 2007. SonarQube is used by developers to manage the quality of source code, and it can be used as a shared, central quality management system. SonarQube enables code quality management for every developer on the team. It supports a wide variety of languages, such as Java, C, C++, C#, PHP, Flex, Groovy, JavaScript, Python and PL/SQL. SonarQube offers fully automated analysis tools and integrates well with Maven, Ant and Gradle. It can use FindBugs, Checkstyle and PMD to collect and analyze code for bugs and possible violations of the code style policy. It examines and evaluates many aspects of your source code, from small style details and possible bugs to critical design errors, lack of test coverage and excessive complexity.
Dependency / Package Manager and Package Manager Tools
Package management (a package management system) is a collection of software tools that automates the process of installing, upgrading, configuring and removing computer programs for a computer's operating system in a consistent manner.
A package manager deals with packages: distributions of software and data in archive files. Packages contain metadata such as the software's name, a description of its purpose, the version number, the vendor, a checksum, and a list of dependencies required for the software to run correctly. After installation, the metadata is stored in a local package database. The package manager maintains this database of dependencies and version information to prevent software mismatches and missing prerequisites.
Package managers work closely with software repositories, binary repositories and app stores, and they are especially common on Linux and other Unix-like systems. Package management systems are categorized first by their package format. Those formats are binary, source code based, and hybrid.
Let's look at some examples of those categories -
1) Binary packages -
Linux
Windows
macOS
BSD
Solaris, illumos
Android
2) Source code based -
macOS
3) Hybrid systems-
Nix Package Manager
Upkg
MacPorts
Portage
emerge
Collective Knowledge framework
Build tools and Build Automation
Build tools are programs that automate the creation of executable applications from source code. Building involves compiling and linking the code into a usable or executable form. Build automation is the process of automating the creation of a software build and the associated processes, which include:
Compiling computer source code into binary code
Packaging the binary code
Performing automated testing
Build automation is in principle divided into two main categories. They are -
Build automation utilities (make, rake, cake, ant, gradle)
Build automation servers
Build automation utilities -
The purpose of a build automation utility is to generate build artifacts through activities such as compiling and linking source code.
Build automation servers -
These servers are generally web-based tools that run build automation utilities on a scheduled or triggered basis. A continuous integration server is a type of build automation server. Depending on the level of automation, the following classification is possible. Those classifications are -
1) Makefile level
Make-based tools
Non-make-based tools
2) Build script (or makefile) generation tools
3) Continuous integration tools
4) Meta-build tools or package managers
Build automation tools also enable the automation of simple, repeatable tasks. The tool works out how to reach the goal by executing the required tasks in the correct, specific order and running each of them.
There are two categories of build tools. They are -
Task-oriented tools
Product-oriented tools
Task-oriented tools describe the dependency network in terms of a specific set of tasks, while product-oriented tools describe things in terms of the products they generate.
Build automation servers come in three types. They are -
On demand automation
Scheduled automation
Triggered automation
Build tools used in the Industry
There are multiple build tools used in the information technology industry. We can give some examples of the popular build tools in the industry.
Some examples of build tools are -
Invoke
Open Build service
WebPack
Azure DevOps
Cake
cmake
Gradle
Ant
Buildr
Maven
MSBuild
NAnt
Rake
Care
Jam
Visual Build
Meister
LuntBuild
FinalBuilder
Scons
Packer
Gulp
Grunt
Broccoli
Sbt
SANDMAN
Let's look at some of these build tools used in today's industry.
1) Invoke
Invoke is a Python (2.6+ and 3.3+) task execution tool and library, drawing inspiration from various sources to arrive at a powerful and clean feature set. Like Ruby's Rake tool and Invoke's own predecessor Fabric 1.x, it provides a clean, high-level API for running shell commands and defining/organizing task functions from a tasks.py file.
2) Open Build service -
The Open Build Service (OBS) is a generic system for building and distributing packages from sources in an automatic, consistent and reproducible way.
3) WebPack -
Webpack is a module bundler for modern JavaScript applications. It processes the application's dependencies and generates a dependency graph, which lets web developers use a modular approach for their web application development.
4) Azure DevOps -
The Azure DevOps project presents a simplified experience where you bring your existing code and Git repository, or choose one of the sample applications, to create a continuous integration (CI) and continuous delivery (CD) pipeline to Azure.
5) CMake -
CMake is platform-independent, free and open-source software for managing the build process of software using a compiler-independent method. It is designed to support directory hierarchies and applications that depend on multiple libraries. It is used in combination with native build environments such as make, Apple's Xcode and Microsoft Visual Studio. It has minimal dependencies, requiring only a C++ compiler for its own build system.
6) Gradle -
Gradle is a project automation tool that builds on the concepts of Apache Ant and Apache Maven and introduces a Groovy-based domain-specific language (DSL) instead of the more traditional XML form for declaring the project configuration. Gradle uses a directed acyclic graph (DAG) to determine the order in which tasks can be run.
7) Ant -
Apache Ant is a software tool for automating software build processes. It originally came out of the Apache Tomcat project in early 2000 as a replacement for the Unix make build tool, and was created because of a number of problems with Unix make. It is similar to make but is implemented in the Java language, requires the Java platform, and is best suited to building Java projects.
8) Maven -
Maven is a build automation tool that is mainly used for Java projects. The word maven means 'accumulator of knowledge' in Yiddish. Maven addresses two aspects of building software: first, it describes how the software is built, and second, it describes its dependencies.
Build Life cycle of Maven
A build lifecycle is a well-defined sequence of phases that defines the order in which the goals are to be executed. In Maven, the build lifecycle consists of the following sequence of phases. The phases are -
prepare-resources
validate
compile
test
package
install
deploy
Let's explain each phase briefly -
prepare-resources - Resource copying; the copying of resources can be customized in this phase.
validate - Validates whether the project is correct and whether all necessary information is available.
compile - The source code is compiled in this phase.
test - Tests the compiled source code using a suitable unit testing framework.
package - In this phase the JAR / WAR package is created as specified by the packaging element in pom.xml.
install - This phase installs the package into the local / remote Maven repository.
deploy - Copies the final package to the remote repository.
Each phase has pre and post phases that can be used to register goals which must run before or after that phase. Maven has three standard lifecycles -
clean
default (or build)
site
A goal represents a specific task that contributes to the building and managing of a project. It may be bound to zero or more build phases. A goal that is not bound to any build phase can be executed outside the build lifecycle by direct invocation.
Clean life cycle
When we execute the mvn post-clean command, Maven invokes the clean lifecycle, which consists of the following phases -
pre-clean
clean
post-clean
The Maven clean goal (clean:clean) is bound to the clean phase in the clean lifecycle.
The clean:clean goal removes the output of a build by deleting the build directory. So when
the mvn clean command is executed, Maven deletes the build directory.
Default (or build) lifecycle -
This is the primary lifecycle of Maven and is used to build the application. It has the following 21 phases.
Those phases are, respectively -
validate - Validates whether the project is correct and whether all necessary information is available to complete the build process.
initialize - Initializes the build state, for example by setting properties.
generate-sources - Generates any source code to be included in the compilation phase.
process-sources - Processes the source code, for example to filter any values.
generate-resources - Generates resources to be included in the package.
process-resources - Copies and processes the resources into the destination directory, ready for the packaging phase.
compile - Compiles the source code of the project.
process-classes - Post-processes the generated files from compilation, for example to do bytecode enhancement/optimization on Java classes.
generate-test-sources - Generates any test source code to be included in the compilation phase.
process-test-sources - Processes the test source code, for example to filter any values.
test-compile - Compiles the test source code into the test destination directory.
process-test-classes - Processes the generated files from test code compilation.
test - Runs tests using a suitable unit testing framework (JUnit is one).
prepare-package - Performs any operations necessary to prepare a package before the actual packaging.
package - Takes the compiled code and packages it in its distributable format, such as a JAR, WAR or EAR file.
pre-integration-test - Performs actions required before integration tests are run, for example setting up the required environment.
integration-test - Processes and deploys the package, if necessary, into an environment where integration tests can be run.
post-integration-test - Performs actions required after integration tests have been run, for example cleaning up the environment.
verify - Runs any checks to verify that the package is valid and meets quality criteria.
install - Installs the package into the local repository, where it can be used as a dependency in other local projects.
deploy - Copies the final package to the remote repository for sharing with other developers and projects.
There are some concepts that relate to the Maven lifecycles. They are -
When a phase is invoked via a Maven command, for example mvn compile, only phases up to and including that phase will be executed.
Different Maven goals are bound to different phases of the Maven lifecycle depending on the type of packaging (JAR / WAR / EAR).
Site lifecycle -
The Maven Site plugin is generally used to create fresh documentation, to generate reports, to deploy the site, and so on.
The phases that are part of the site lifecycle are -
pre-site
site
post-site
site-deploy
Maven
If we search for Maven, we find several different definitions, such as: Maven is a site and documentation tool; Maven extends Ant to let you download your dependencies; Maven is a set of reusable Ant scripts.
There are several terms in Maven:
Build lifecycle -
A Maven build follows a specific lifecycle to build, deploy and distribute the target project. There are three built-in lifecycles -
default lifecycle - the most important lifecycle, because it is responsible for project deployment
clean lifecycle - cleans the project and deletes all files generated by the previous build
site lifecycle - creates the site documentation of the project
Each lifecycle has a sequence of phases. For the above-mentioned lifecycles -
default (or build) - 21 phases
clean - 3 phases
site - 4 phases
Maven build phases
A phase in Maven represents a stage in the Maven build lifecycle. Each phase is responsible for a specific task.
We can show some examples of phases -
validate - check that all information needed for the build is available
compile - compile the source code
test-compile - compile the test source code
test - run unit tests
package - package the compiled source code in the distributable format (jar, war, ...)
integration-test - process and deploy the package, if necessary, in order to run integration tests
install - install the package in the local repository
deploy - copy the final package to the remote repository
The examples above are all from the default lifecycle.
Maven build goals -
Each goal represents a specific task. We can name a few phases and the default goals bound to them -
compiler:compile - the compile goal of the compiler plugin is bound to the compile phase
compiler:testCompile - is bound to the test-compile phase
surefire:test - is bound to the test phase
install:install - is bound to the install phase
jar:jar - is bound to the package phase
We can use some commands to find out which goals are bound to which phases in these lifecycles.
mvn help:describe -Dcmd=PHASENAME - use this command to list all the goals bound to a specific phase and their plugins.
mvn help:describe -Dcmd=compile - use this command to list all the goals bound to the compile phase.
The output then includes a line such as -
'compile' is a phase corresponding to this plugin: org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
PAF tute 2
1. What is the need for VCS?
Version control systems are a category of software tools that help a software team manage changes to source code over time. Version control software keeps track of every modification to the code in a particular type of database. If a mistake is made, developers can turn back the clock and compare earlier versions of the code to help resolve the mistake while minimizing disruption to all team members.
2. Differentiate the three models of VCSs, stating their pros and cons
3. Git and GitHub, are they the same or different? Discuss with facts.
Git and GitHub are not the same; they are different. Git is a distributed version control tool that can manage a development project's source code history, while GitHub is a cloud-based platform built around the Git tool.
Git is a tool that a developer installs locally on their computer, while GitHub is an online service that stores code pushed to it from computers running the Git tool.
The main difference between Git and GitHub is that Git is an open-source tool that developers install locally to manage source code, while GitHub is an online service to which developers who use Git can connect to upload or download resources.
Another way to examine the differences between Git and GitHub is to look at their competitors. Git competes with centralized and distributed version control tools such as Subversion, Mercurial, ClearCase and IBM's Rational Team Concert. GitHub, on the other hand, competes with cloud-based SaaS and PaaS offerings such as GitLab and Atlassian's Bitbucket.
Compare and contrast the Git commands commit and push
Commit - records your changes in the local repository (on your computer), for example: git commit -m "Fix login bug"
Push - uploads the committed changes to the remote repository (on a server, e.g. GitHub), for example: git push origin master
In fact, git commit "records changes to the repository" while git push "updates remote refs along with associated objects". So the first is used against your local repository, while the latter is used to interact with a remote repository.
Oliver Steele's well-known diagram of the Git data transport commands gives a clear picture of this model and the commands:
4. Discuss the use of the staging area and the Git directory
From Git's point of view, there are three areas where file changes can live: the working directory, the staging area and the repository.
When you work on your project and make changes, you are dealing with the project's working directory, which is the project directory on the file system of your computer. Any changes you make remain in the working directory until you add them to the staging area (via the git add command). The staging area can best be described as a preview of your next commit: when you run git commit, Git takes the changes that are in the staging area and creates the new commit from them. A practical use of the staging area is that it lets you refine your commits. You can add and remove changes in the staging area until you are satisfied with how your next commit will look, at which point you run git commit. After you have committed your changes, they go into the .git/objects directory, where they are stored as commit, blob and tree objects.
There are many uses of staging in Git, such as:
Staging lets you split one big change into multiple commits. Let's say you have worked on a big change involving a large number of files and quite a few different subtasks. You did not plan this in advance - you were "in the zone", as they say, and did not want to stop to think about splitting the commits the right way. (And you are smart enough not to dump the whole thing into a single commit!) Now that the change is fully tested and working, you can commit it properly, in several clean commits, each focused on one aspect of the code changes. Use the index to stage each set of changes and commit, until no uncommitted changes remain.
Staging helps you review changes - staging lets you "check off" individual changes while reviewing a complex commit, so you can focus on the things that have not yet passed your review.
Staging lets you commit a small change in the middle of a bigger one - let's say you are in the middle of a somewhat big change and you are told about a very important bug that needs to be fixed as quickly as possible. The usual recommendation is to do this on a separate branch, but suppose the fix is really just a few lines and can be tested just as easily without affecting your current work. With Git you can quickly stage and commit just that change, without committing all the other things you are still working on.
Staging helps when merging with conflicts - when a merge takes place, changes that merge cleanly are updated both in the staging area and in your working tree. Only changes that did not merge cleanly (i.e. caused a conflict) are displayed when you run git diff, or in the top-left pane of git gui. Again, this lets you focus on the things that need your attention - the merge conflicts.
Explain the collaboration workflow of Git, with example
Discuss the benefits of CDNs
Companies engaged in media, entertainment, gaming, software, online retail and much more, with rich digital content on their websites that they want to deliver quickly and reliably to their audience, can use a CDN. Consumers want a high-quality online experience, whether they watch a movie, stream an event, play a game or shop online. The use of a CDN results in an increase in performance, giving end users an improved consumer experience.
Companies that see a huge amount of traffic on their websites every day can use a CDN to their advantage. When a large number of users simultaneously access a web page with certain specific content, such as a video, a CDN can send that content to each of them without delay. Here are some of the benefits of using a CDN:
Your server load will decrease - because strategically placed servers form the backbone of the network, companies gain an increase in capacity and in the number of simultaneous users they can handle. In essence, the content is spread over multiple servers instead of being offloaded onto one large server.
Content delivery becomes faster - thanks to high reliability, operators can deliver high-quality content with a high level of service, low network and server loads and therefore lower costs. In addition, widely used assets such as jQuery are ubiquitous on the internet; it is very likely that someone who visits a particular page has already downloaded the same file from the Google CDN in the past, so the file is already stored in the browser cache and does not have to be downloaded again.
Segmenting your target group is easy - CDNs can deliver different content to different users, depending on the type of device requesting the content. They are able to detect the type of mobile devices and can provide a device-specific version of the content.
Lower network latency and packet loss - End users experience less jitter and improved stream quality. CDN users can therefore deliver high-definition content with a high Quality of Service, low costs and a low network load.
Higher availability and better usage analytics - CDNs distribute assets dynamically across strategically placed core, fallback and edge servers. CDNs give more control over the delivery of assets and the network load: they can optimize capacity per customer, provide real-time views of load and usage metrics, reveal which assets are popular, show active regions, and report exact delivery data to customers. CDNs can thus offer 100% availability, even in the event of major power outages or network and hardware failures.
Storage and security - CDNs provide secure storage for content such as videos for businesses that need it, as well as archiving and enhanced data backup services. CDNs can secure content through Digital Rights Management and restrict access via user authentication.
How do CDNs differ from web hosting servers?
Web hosting means hosting your website on a server so that people can access it over the Internet, while a CDN increases the delivery speed of your web content around the world.
A CDN currently serves mainly the static parts of your website from its cache, whereas a web hosting server, on the other hand, holds all of your web-related content, including the dynamic pages.
Web content is usually hosted on a single server, but CDN content is distributed across multiple hosted environments around the world.