Discussion of the Need to Practice Agility and Embrace
a New Method
While working at the data security
firm, I have learned that the most effective method of safeguarding against
data breaches is a hybrid solution. Initially, our security firm had
implemented asymmetric encryption to curb data breaches. However, that
technique was not effective, since the firm lost significant information through
unauthorized access and theft. Therefore, I have learned that a hybrid
data encryption and decryption method would improve algorithm development,
since it combines asymmetric and standard cryptography. This
newer data security approach reinforces the cipher weaknesses of each
encryption technique. Consequently, I have the responsibility of sharing this
newly learned data security system with my colleagues at the workplace.
Although my colleague at work is
unwilling to learn and apply this effective data security technique, I would
advise her to embrace organizational agility in several ways. First, the
ability and readiness to learn from past data breach experiences would
benefit our IT security firm (Harraf, Wanasika, Tate, & Talbott, 2015).
Second, I would persuade her that improving one's self-awareness is an
integral part of thinking with agility, which means challenging formerly
learned ways of acting and unlearning them. For example, I would
challenge her to embrace this new hybrid data security approach by establishing
stable autonomy and role clarity, which would compel her to maintain a
collaborative learning culture (Harraf et al., 2015). I believe that if she
used strategic agility, we would successfully change our team's priorities.
Therefore, with the necessary support from the management team, we would help
empower each other to solve the data breach problem we have recently witnessed.
Why Strategic Management Becomes
Important to Today’s Corporations
Strategic management is of paramount importance in today's corporations. It
provides the company with the direction and leadership required to define
strategies that align with its mission, vision, and objectives (Wheelen et al., slide 4).
Strategic management offers employees a sense of unification with the
business organization, causing them to express their loyalty toward the
attainment of commonly defined goals. Likewise, strategic management is well
placed to motivate organizational workers to show their best in work
performance and productivity (Wheelen et al., slide 17). Therefore,
organizational leadership needs to plan for change while treating strategic
management as a continuously changing process.
Necessary Information for Proper
Formulation of Strategy
Before attempting to proceed with strategy formulation, a precise
definition of the strategy is critical. All the information should be tailored
to an organization's mission and objectives while considering the immediate
environment (Wheelen et al., slide 19). Additionally, a proper definition of
the strategic objectives would guide the corporate leadership toward a
particular adaptation action plan, outlining a roadmap and assigning roles.
Indeed, a strategic plan would demonstrate the need for the organization to
measure its effectiveness and progress over a formulated competitive strategy
(Wheelen et al., slide 26). Apart from that, information on strategy
formulation is a necessity because it drives an organization to examine its
prospects of adaptation in the foreseeable future (Wheelen et al., slide 20).
Consequently, the same information would help prepare the corporation for
change through appropriate capital budgeting, instead of waiting inactively
for outcompeting market forces.
Wheelen, Thomas L., et al. Strategic Management and
Business Policy: Globalization, Innovation, and Sustainability. 15th ed.,
Pearson Education, Inc., 2018.
Data analysis is a critical component in understanding
and extracting the actual meaning from business insights in the modern business
setting. The reason is that data analysis provides the basis for business
success. However, large volumes of data are created daily, with less than
1% being analyzed and adopted to improve the business's value. Even so, this
small fraction still provides essential information for achieving any
organization's desired goals. Thus, knowing how to collect, analyze, and
interpret data remains essential. In
light of the above, a researcher from Mini Project Part 2 Company seeks support
in data analysis. The company has a new dataset
of 1000 California properties that contain the same variables as their first
dataset. Now the researcher is interested in knowing which characteristics of
California properties combined that best explain the variation observed in the
median housing value of homes in California. They would like to help develop
multiple linear models, including the predictors that best explain median
housing value and can appropriately predict the median housing value of a new neighborhood.
They will also share this model with real estate agents, so it should be simple
to understand effortlessly. Thus, the proposed model should be complex
enough for good predictions and description of the population (with all the
suitable properties) but simple enough to be easily understood. Hence, it
is essential to ensure that the analysis steps are clear and justifiable,
leaving no question about why the presented model was chosen over any other
possibility.
The data for the study was collected from the
Quercus project page. A population of 20,433 California homes was selected
before sampling the data to obtain a sample size of 1,000 California homes for analysis.
The study utilized 14 variables to help come up with
the appropriate model. The variables were randomly selected from the company
data and processed through four stages to build a reliable model. First, all
14 variables were subjected to regression analysis, and the resulting model
was tested for reliability. Scatterplots were used to assess the residuals.
Afterwards, several transformations were made to remove data that exhibited
excessive multicollinearity. Finally, r-standard plots were produced to
examine the dispersion of the data from the mean and to assess the quality of
the final model. The following are the variables utilized in the model:
Longitude – the longitude where the home/region is located
Latitude – the latitude where the home/region is located
Housing median age – the median age of houses in the area of this home
Total rooms – the total number of rooms in the homes in this area
Total bedrooms – the total number of bedrooms in the homes in this area
Population – the population of the area where this home is located
Households – the number of households in the area where this home is located
Median income – the median income of households in the area (in ten-thousand dollars)
Median house value – the median house value in the area where this home is located
Near bay – an indicator of whether the home/region is located near a bay
Near ocean – an indicator of whether the home/region is located near the ocean
One-hour ocean – an indicator of whether the home/region is located within a one-hour drive of the ocean
Inland – an indicator of whether the home/region is located inland
Further, the X variable was employed as an identifier for each observation in the data.
Accordingly, the following model was used in setting up the assessment:
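Since the fitted equation itself is not reproduced above, a minimal sketch of the multiple-linear-regression setup might look like the following. The data and coefficients here are synthetic stand-ins, not the company's actual figures.

```python
import numpy as np

# Synthetic stand-in for a few of the predictors described above.
rng = np.random.default_rng(0)
n = 200
median_income = rng.uniform(0.5, 15, n)   # in ten-thousand dollars
total_rooms = rng.uniform(100, 5000, n)
latitude = rng.uniform(32, 42, n)

# Invented "true" relationship plus noise (illustration only).
y = 40000 * median_income + 5 * total_rooms + rng.normal(0, 10000, n)

# Design matrix with an intercept column, as in y = b0 + b1*x1 + b2*x2 + ...
X = np.column_stack([np.ones(n), median_income, total_rooms, latitude])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(beta[1]))  # estimate of the median-income coefficient
```

With enough observations, the least-squares estimate recovers the planted coefficient closely, which is the mechanism the paper's model relies on.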
There are different approaches to model validation. The study implemented the
split-data method using functions in the R-Studio statistics software. In this
approach, the housing.csv data was split into two parts: training data and
validation data. The predicted probability (score) for the validation sample
was then computed using the considered model. The score file was ranked in
descending order of estimated probability, split into deciles, and the
observations and cumulative events in each decile were assessed. The
cumulative events' gain score was determined and divided by the percentage of
data in each decile. Lastly, a KS statistic was computed to measure the degree
of separation between the negative and positive distributions.
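The split-and-score idea above can be illustrated with a small, self-contained sketch. The two score distributions below are synthetic stand-ins for the "positive" and "negative" groups, and the KS statistic is computed by hand as the maximum gap between the two empirical CDFs:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0, 1, 1000)
train, valid = data[:700], data[700:]   # simple train/validation split

# Two synthetic score distributions standing in for the positive and
# negative groups produced by scoring the validation sample.
pos = rng.normal(0.5, 1, 500)
neg = rng.normal(-0.5, 1, 500)

# KS statistic: the maximum vertical gap between the two empirical CDFs.
grid = np.sort(np.concatenate([pos, neg]))
ecdf_pos = np.searchsorted(np.sort(pos), grid, side="right") / len(pos)
ecdf_neg = np.searchsorted(np.sort(neg), grid, side="right") / len(neg)
ks = np.max(np.abs(ecdf_pos - ecdf_neg))
print(ks > 0.2)  # well-separated distributions give a large KS value
```

A large KS value indicates the model separates the two groups well, which is what the decile-based validation above checks.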
The model was tested for the existence of multicollinearity.
Multicollinearity arises from inter-dependence among the independent
variables, and a high degree of correlation between them renders the model
invalid.
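One common way to quantify this inter-dependence is the variance inflation factor, VIF_j = 1 / (1 − R²_j), where R²_j comes from regressing predictor j on the remaining predictors. A hand-rolled sketch on synthetic data (not the study's variables) could look like this:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
rooms = rng.normal(0, 1, n)
bedrooms = 0.9 * rooms + 0.1 * rng.normal(0, 1, n)  # nearly collinear pair
income = rng.normal(0, 1, n)                        # independent predictor
X = np.column_stack([rooms, bedrooms, income])

def vif(X, j):
    """VIF for column j: regress it on the other columns plus an intercept."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - resid.var() / y.var()
    return 1 / (1 - r2)

vifs = [vif(X, j) for j in range(X.shape[1])]
print([round(v, 1) for v in vifs])  # large values flag collinear predictors
```

Predictors with very high VIFs (the nearly collinear pair here) are the ones a transformation or removal step, like the one described above, would target.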
In this section, the results of the analysis are
presented. The aim is to illustrate the information obtained from analyzing the
company data. The section also describes the data, processes involved in
obtaining the results, and assessing the quality of the model.
Regression Analysis – based on the regression model
under (i), the analysis produced the following regression output.
Results of Regression Analysis
Figure 1 shows a summary of the regression output as obtained from the
R-Studio analysis report. In the residuals, the minimum value was -255,804
and the maximum value was 334,858; the quartiles range from -40,354 to 30,449.
Distribution of Data
Distribution assessment sought to ascertain how the data is distributed. A
scatterplot was utilized for this, as displayed in figure 2 below.
Scatter Plot 1
Figure 2 shows the distribution of the housing median value. Figure 3 below is
a normal probability plot, used to assess the distribution of the sample data
against the normal line. The quantile-quantile plot assesses how the data
deviates from the expected normal distribution line.
Normal Q-Q Plot
Figures 5 and 6 show the r-standard plots for the data, which were utilized to
assess the standard dispersion of the data from the mean. Figure 5 shows the
r-standard distribution against deciles, and figure 6 shows the r-standard
distribution against the quantiles.
Figure 4: Housing Data
Figure 5: R-Standard
Goodness of the Final Model
The goodness of the final model was assessed by examining the goodness of fit,
as illustrated in figure 6 below. The best model is determined by how well it
cuts through the majority of the data.
Figure 6: Improved Scatter Plot 2
Discussion and Conclusion
The section presents a summary of the findings and interpretation
of the analysis results as demonstrated under the results section. The purpose
is to find meaning and assess the relevance of the information towards
addressing the current problem faced by the company. It also discusses the
flaws in the data.
The final regression analysis revealed the following. The intercept implies
that the housing value decreases by 3.367 units when the other factors are
absent. Equally, latitude and longitude have negative implications for housing
value, as a change in these factors results in a significant decrease in value
of 3.845 and 3.442 units, respectively. Similar effects hold for total rooms,
population size, and households, which significantly decrease house value by
3.507, 2.845, and 1.611 units, respectively. However, a unit change in total
bedrooms, median housing age, and median income increases housing value by
1.296, 1.012, and 3.713 units. Equally, houses near the bay, near the ocean,
or within a one-hour drive of the ocean have a higher value than those away
from these locations. Further, testing each coefficient at a p-value < 0.05
reveals that only latitude, longitude, median housing age, total bedrooms,
population, median income, and near ocean are statistically significant; the
rest are insignificant at the 5% significance level. Thus, only these factors
influence the company's housing value changes.
Regarding the quality of the data, most of the housing value data were
positively skewed, as revealed in Figure 2. Examining the quantile-quantile
plot (figure 3) revealed that the data lies away from the expected normal
distribution and is positively skewed. Equally, the r-standard distribution
plots indicate that most of the data were dispersed away from the mean.
Thus, the final model is the most appropriate for consideration. It is the
best fit, as it cuts through most of the data points, as illustrated in figure
6. This implies that there is a possibility of fluctuation in value with
changes in the data distribution.
In summary, the analysis was not without challenges. Most of the data were
sourced from open sources without validation. Further, examination of the
study indicates that other factors should be considered in ascertaining
appropriate predictors of housing value; however, the study was limited to the
current factors for lack of time to incorporate others. Thus, future studies
should focus on incorporating a wider range of predictors.
Web-Based Application and Infrastructure for the Enterprise
Whereas the landscape of web servers comprises many divergent technologies,
Apache is arguably one of the best web servers. It is developed as the
open-source Apache HTTP Server project, whose source code is freely available
for viewing and collaboration. Unlike other web servers such as NGINX, Apache
HTTP is regarded as the most popular, at roughly 50% market share, since it can
handle large quantities of traffic with minimal configuration. Apache can also
be deployed with ease on Linux, macOS, and Windows.
Overall Product Architecture and Key Components of the Web-Based Product
Product architecture entails the organization, or chunking, of Apache's
functional elements; in particular, it encompasses the interactions of all
chunks or elements. It plays an integral role in designing, selling, and
revamping the offer of a new product. Therefore, the web-based infrastructure
refers to the strategies involved in mapping function onto the product's form.
Figure 1. Web-based product architecture
Generally, there are two basic types of product architecture: modular and
integral. In the modular type, well-described components interface
functionally within self-contained modules. The product is organized into
several modules to create and develop a particular goal, and the interaction
of all the modules delivers the overall purpose of the product, bringing
benefits such as outsourcing and task allocation. In short, modular product
architecture brings many other advantages, ranging from economies of scale and
standardization to mass customization.
In the integral product architecture, by contrast, the product's functions are
distributed among its physical elements. The mapping between components and
functions is therefore expected to be more complex. At the same time, it is
easier to optimize the whole product since each component is adapted for a
specific function.
The web-based application (Apache) has four critical components associated
with streaming architecture, as follows.
The message broker or stream processor – this component takes data from a
producer source, converts it into a standard message format, and streams it
continuously, while other components listen for and consume the passed-on
messages. The latest high-performing message brokers include Apache Kafka, as
illustrated by figure 2. Streaming brokers support extremely high performance,
with persistence and capacity of gigabytes of messages per second.
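As a rough illustration of the broker pattern described here, consider the following deliberately simplified in-memory sketch. This is a teaching toy, not Apache Kafka itself:

```python
from collections import defaultdict

class MiniBroker:
    """Tiny in-memory publish/subscribe broker (illustration only)."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Consumers register a handler that "eavesdrops" on a topic.
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # The broker wraps the payload in a standard message format
        # and streams it to every subscribed consumer.
        message = {"topic": topic, "payload": payload}
        for handler in self.subscribers[topic]:
            handler(message)

received = []
broker = MiniBroker()
broker.subscribe("clicks", received.append)
broker.publish("clicks", {"user": 1, "page": "/home"})
print(received[0]["payload"]["page"])  # -> /home
```

A real broker such as Kafka adds persistence, partitioning, and replication on top of this basic produce/convert/consume flow.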
Figure 2. The message broker or stream processor
Batch and real-time ETL tools – this component combines, changes, and
configures numerous messages before SQL-oriented analytics tools can structure
them. The ETL tools receive queries from varied users and create predictable
outcomes by queuing and applying the queries, as shown in figure 3. Apache
Storm is an example of this kind of component in a streaming architecture.
Figure 3. Batch and real-time ETL tools
Data analytics or serverless query engine – this component analyzes the
streamed and consumed data to provide value through the analytics tool. For
example, Kafka Connect can stream topics directly into Elasticsearch, creating
the correct data types automatically through mappings.
Streaming data storage – the advent of low-cost storage technologies has led
many organizations to begin storing their streaming event data from Kafka in
data lake storage, as demonstrated by figure 4.
Figure 4. Modern streaming architecture
Outlining the Types of Problems Apache Solves for the Enterprise
The implementation of modular architecture gives Apache the capacity to adapt
to shifting business requirements and environments. When a business undertakes
a technological transformation, a flexible and scalable architecture gives
room for introducing new workflows.
The use of Apache brings about a better user experience when shifting an
application to a mobile platform. The shift to a mobile-based platform is
easily aided by web-based architecture, leading to increased productivity
coupled with timely decision-making.
Apache gives room for solid defense systems that address challenging web
security issues. For example, embedded application vulnerability tests have
capabilities beyond merely recognizing loopholes in the system.
With the emergence of big data, it has become essential for enterprise
applications to curate, organize, and centralize data projects on Hadoop
platforms.
The changing technology has necessitated the introduction of Artificial
Intelligence through highly adapted enterprise applications and Software as a
Service (SaaS). Indeed, Apache functions in line with the system requirements
of the Internet of Things (IoT) and microservices.
The implementation of web-based applications has resolved interoperability
standards by offering a platform where various applications can be linked
smartly. For example, it is possible to connect the functions of both Leave
Management and Payroll Systems.
How the Apache is Deployed and Set up for the Client and Server-Side
Apache is deployed and set up to communicate over networks from the client to
the server side through the TCP/IP protocol. The most frequently used protocol
is HyperText Transfer Protocol Secure (HTTP/S), which defines how messages are
formatted and conveyed across the web. In particular, the protocol defines
specific commands for both the client and the server regarding how to respond
to requests. HTTPS traffic usually occurs across port 443, while unsecured
traffic happens across port 80. Besides, the Apache server set-up is aided by
configuration files, where the applied modules help control the server's
behavior. Apache listens on the IP addresses configured inside these files and
can accept and route traffic while bound to particular address ports. Since
the Apache Listen directive defaults to port 80, it can be changed to run on
different ports to host many domains. Apache returns acknowledgment notices
(ACK) to the original sender when messages have successfully reached their
destinations. If an error is encountered while receiving data, the protocol
returns Not Acknowledged notices (NAK), confirming the need for retransmission.
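A minimal httpd.conf fragment illustrating the Listen directive and per-domain hosting described above might look as follows; the domain and paths are placeholders, not a production configuration:

```apache
# Accept connections on the default HTTP and HTTPS ports.
Listen 80
Listen 443

# One virtual host per domain; the port can be changed per domain.
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot "/var/www/example"
</VirtualHost>
```

Changing `Listen 80` to another port is how Apache is made to serve additional domains or applications on non-default ports.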
Defining Common Operating Processes, Procedures, and Trends for
Managing Apache in a Large-Scale Enterprise Network
With the introduction of Apache Hadoop as an open-source software platform, it
has become easy to manage the storage and processing of data in large-scale
enterprise networks. Apache Hadoop comprises the following operating processes
as its functional components:
Hadoop Common – a library of utilities relevant for the operating functions of
Hadoop.
Hadoop Distributed File System (HDFS) – the distributed file system that
stores data while providing high aggregate bandwidth over a network.
Hadoop YARN – a resource-management platform for computation on clusters; it
helps schedule diverse applications from the users.
Hadoop MapReduce – a programming model for large-scale data processing across
enterprise networks.
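The MapReduce model named above can be illustrated with a toy, single-process word count; real Hadoop MapReduce distributes the same map and reduce phases across a cluster:

```python
from collections import defaultdict

def map_phase(lines):
    # Map step: emit a (word, 1) pair for every word seen.
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    # Reduce step: sum the counts for each distinct word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big cluster", "big network"]
counts = reduce_phase(map_phase(lines))
print(counts["big"])  # -> 3
```

Hadoop's contribution is running many mappers and reducers in parallel over HDFS blocks, but the programming model is exactly this map-then-reduce shape.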
Generally, there are two primary components of Apache Hadoop: the HDFS and
MapReduce parallel processing frameworks. These open-source projects were
developed under the inspiration of technologies created inside Google.
Figure 5. The high-level architecture of the Apache web application
The implementation of the web-based application is a precursor for supporting
future growth in the IT industry, given its increased demand,
interoperability, and improved reliability requirements. With the web-based
application making use of object-oriented programming in its architecture, its
functions are defined as follows:
Delivering consistent data via HTTP on the web.
Ensuring that requests contain the most valid data.
Offering complete authentication to users.
Limiting users' access through view-based permissions.
Creating, updating, and deleting records.
Apart from that, the web-based application has evolved tremendously, as
technology has. The use of service-oriented architecture in Apache is
considered one of the emerging trends that make web-based applications a
service platform. For example, with HTTP APIs, one facet of code can make
requests to another part of the code even when they run on different servers.
Additionally, the single-page application is regarded as an emerging trend in
web-based application architecture: the web page acts as the user interface
throughout the application's various interactions. Consequently, the user
experiences a more natural appeal coupled with a reduced number of page-load
interruptions.
Security Concerns for Apache and How IT Managers Ensure Secure Deployment
Injection – IT managers would need to control and vet each user input for
possible attack vectors.
Broken authentication – IT managers would implement multi-factor
authentication, thus preventing automated credential-stuffing and related
credential attacks.
Sensitive data exposure – IT managers would need to take stock of, scale down,
and lock up sensitive data.
Broken access control – IT managers would apply serverless APIs to discourage
attackers from modifying access controls.
Security misconfiguration – IT managers would employ a comprehensive security
configuration tool that constantly monitors, resolves, normalizes, and reports
cases of misconfiguration.
Cross-site scripting – IT managers would need to prevent XSS vulnerabilities
from appearing in an application, which can be done by escaping user input.
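The "escape the user input" mitigation can be sketched with the Python standard library; the payload below is a contrived example:

```python
import html

# A hostile input that would execute as a script if rendered verbatim.
user_input = '<script>alert("xss")</script>'

# Escaping converts markup characters into harmless HTML entities
# before the value is embedded in a page.
safe = html.escape(user_input)
print(safe)  # -> &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

After escaping, the browser displays the text instead of executing it, which is the core of the XSS defense described above.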
Common Web-Based Application and Infrastructure Failures that Impact
the IT Industry
DNS Issues and Network Connectivity. With failed network connectivity and an
inefficient firewall, there would be faulty DNS queries. DNS problems stemming
from improper network protection cause 404 errors coupled with inappropriate
pathways, which prevent visitors from reaching their destined websites. This
problem can be resolved by employing DNS monitoring safeguards.
Slow Servers and Loading Time. Hosting websites on a slow server negatively
impacts the general process of analyzing tools in Apache. This issue can be
resolved by co-hosting the sites under shared accounts that are relatively
fast.
Poorly Written Code. A common problem associated with web application
performance is poorly written code. Inefficient code causes memory leaks and
inappropriate synchronization, leading to application deadlock. This issue can
be resolved by applying optimal coding practices such as profilers and code
reviews.
Lack of Load Balancing. Web-based applications may suffer sluggish response
times that originate from poor data-load spreading: new visitors are
erroneously allotted to an already-overloaded server, drowning out its
capacity. Nonetheless, implementing tools like NeoLoad and AppPerfect helps
pinpoint the weaknesses in the architecture infrastructure, leading to high
scalability.
Traffic Spikes. A web-based application like Apache faces numerous spikes
resulting from promotional videos. Such marketing videos take up extra
traffic, causing the servers to slow down. As a result, the performance of the
whole IT industry might be hindered, reducing Apache's popularity among its
implementers. Nonetheless, traffic spikes can be handled by setting up an
early warning system that simulates user load.
Duplication of Specific HTML Title Tags. Web-based applications are prone to
duplication of specific HTML title tags, causing sites to lose traffic
visibility. For example, many web developers assign similar HTML title tags,
leading search engines to consider such websites duplicates of one another.
One may resolve the issue by cross-checking title tags in Google Search
Console for duplications or errors.
Lack of Optimized Bandwidth Usage. Since most developers depend entirely on
local networks while testing their websites, adding videos, visuals, audio, or
other high-volume data might negatively impact performance in production.
Therefore, developers need to optimize bandwidth usage to achieve a
performance boost. Likewise, images can be served via server-side HTTP at the
desired resolution and size.
Describing Alternative Systems from Competitors Based on Strengths and Weaknesses
Apache vs. NGINX. Because Apache uses a thread-based structure, heavy traffic
can cause performance problems, whereas NGINX addresses the c10k issue. NGINX
has an event-based architecture that does not create a new process for each
request the way Apache does; instead, NGINX handles each incoming request in a
single thread. While Apache is reliably secure, it does not manage high
traffic as effectively as NGINX. Thus, Apache is more suitable for medium and
small businesses, since it is more easily configured than NGINX thanks to its
modular architecture.
Apache vs. Tomcat. Tomcat is explicitly meant for Java apps, while Apache is a
general-purpose HTTP server. Apache is cross-platform and can serve other
programming languages such as Perl, PHP, and Python. Tomcat is less efficient
at serving static pages than Apache; for instance, Tomcat pre-loads an
unnecessary Java Virtual Machine for such websites. Consequently, Apache is
more configurable than Tomcat, because one would use a general-purpose HTTP
server when running, for example, WordPress.
Future Expected Projects as Web-Based Application Evolves
With the Apache HTTP web server designed to serve only static web pages, the
recent introduction of Apache Tomcat has evolved the infrastructure and module
components of this product architecture. As a result, Tomcat is adapted to
serve Java applications, even though there are instances of reduced
efficiency. Therefore, software developers are responsible for developing
highly adapted versions of the Apache web server, focusing centrally on
upgrading its efficiency during the configuration phase. In brief, developers
need to design web-based applications for the required traffic to avoid
scalability problems.
The Influence of Employee Status On Salary Prediction for IT Firms: Case
of Grant Technologies, Inc.
Recruitment of employees is critical in advancing a business strategy.
Currently, recruitment processes are changing drastically and becoming highly
complex tasks that require intensive interviews and evaluation. Salary is an
important aspect of employment (Frost, n.d.). It determines whether a given
firm can attract qualified employees and thus influences the quality of
employees that a firm would hire.
Among software engineering firms, the challenge of predicting employee salary
is prominent. A prediction is an assumption about the future based on existing
knowledge and experience (Frost, n.d.), and it is essential in helping firms
plan. Thus, this paper aims at predicting employee salaries for software
engineering firms based on different factors, including test scores, age,
years of experience, and gender.
Collection of Data
The analysis in this paper seeks to understand whether years of experience,
test score, employee age, and gender can predict employee salary in software
engineering firms. To implement the project, data was collected from a sample
of 30 web developers at Grant Technologies, Inc. An aptitude test was utilized
to collect the test scores for each web developer. This study conducts a
multiple regression to assess the validity of the model and the statistical
significance of the independent variables (test score, years of experience,
age, gender) in influencing the behavior of the dependent variable (salary).
The study defines employees in terms of age, job experience, test scores, and
gender.
The following regression model was used in achieving the objectives of the
paper. The model assesses how gender, age, years of experience, and test score
affect employee salary for software engineering firms. Gender was denoted by 1
(if male), 2 (if female), and 0 (undefined).
The following are the results of the regression analysis, displayed in the
regression table.
Table 1: Regression Analysis
As shown in Table 1, the intercept is 334.75 with a p-value of 0.00. The
gender, age (years), test-score, and years-of-experience coefficients are
9.35, 1.68, -0.45, and 12.37, respectively. This yields the following
regression model (p-value = 0.01):
The ANOVA F-test results are displayed in the following table.
Table 2: ANOVA F-test Results (p-value = 0.01)
The regression model shows that if all factors are held constant, web developers at Grant Technologies, Inc. will earn $334.75. Each additional year of age increases employee salary by 9.35 units, gender by 1.68 units, and each year of experience by 12.37 units. However, test scores have a negative influence on employee salary, reducing it by 0.45 units. As illustrated by the R-squared value, these independent variables predict 58.6% of the variation in employee salary; the rest is explained by variables outside the model. The F-test results show that the calculated value (F-calculated = 0.00014) falls below the critical value (F-critical = 3.49) at alpha = 0.01. Since F-calculated < F-critical, we fail to reject the null hypothesis, implying that the model, taken jointly, is not statistically significant at the 1% level.
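The F-test decision rule applied here can be sketched as a small helper (an illustrative function, not part of the study's analysis): the null hypothesis is rejected only when the calculated F statistic exceeds the critical value.

```python
# Illustrative sketch of the F-test decision rule used in the analysis.
def f_test_decision(f_calculated, f_critical):
    """Reject H0 only when the calculated F exceeds the critical value."""
    return "reject H0" if f_calculated > f_critical else "fail to reject H0"
```

Applying it to the reported values, `f_test_decision(0.00014, 3.49)` yields "fail to reject H0".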
The study indicates that age, experience, gender, and test scores influence employee salary. Test scores negatively affect employee salary, while the rest of the variables have a positive influence. Thus, employers should consider these factors when evaluating their employees' salaries.
Although the model attains a moderately high R-squared value of 58.6%, it leaves 41.4% of the variation unexplained. Thus, the current independent variables cannot thoroughly explain all the changes in salary. Other factors outside the model, which the study did not incorporate, should be considered; these may include the inflation rate and economic stability, among others.
Definition of variables
Age – the age of the employee
Test score – the aptitude-test score obtained from employee responses
Years of experience – the time the employee has worked within the same profession
Gender – the gender of the employee (Male = 1, Female = 2, Undefined = 0)
Addressing How the Oakland
A’s Used Statistical Methods
Following Lewis (2003), the inspiration behind Moneyball motivated Billy Beane to build his draft of baseball players around specific statistical factors. His actions entailed imposing on-base percentage (OBP) coupled with slugging percentage on the Oakland Athletics (Lewis, 2003). Beane went ahead to combine these two statistical variants into a new tool known as on-base plus slugging (OPS) to effect a fruitful change. Apart from that, Beane's new approach did not feature power as an essential factor for determining a player's ability to run and score. In his belief, unlike patience, power is developed over a given period along with a player's capacity to reach the desired base (Lewis, 2003). Therefore, he relied on baseball players from college because of their experience compared to high-school ones. The high-school players lacked the stable power required to maneuver during the game, implying limited potential to win.
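As a hedged illustration of the statistics named above (standard baseball definitions, not quoted from Lewis's text), OBP and SLG can be computed separately and then combined into OPS:

```python
# Standard baseball definitions (illustrative, not drawn from Lewis's text).
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sac_flies):
    """OBP: how often a batter reaches base."""
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sac_flies)

def slugging_percentage(singles, doubles, triples, home_runs, at_bats):
    """SLG: total bases per at-bat, weighting extra-base hits."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats

def ops(obp, slg):
    """On-base plus slugging: Beane's combined yardstick."""
    return obp + slg
```

For a hypothetical season line, `ops(on_base_percentage(150, 60, 5, 500, 5), slugging_percentage(100, 30, 5, 15, 500))` simply sums the two rates.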
Beane adopted Bill James’ sabermetrics ideology as a measure for deciding on his team’s lineup. In this regard, he chose the line of action that would yield the best opportunities for winning. For instance, he fully understood that any guaranteed win was pegged on his team’s ability to score more runs than the competing teams (Lewis, 2003). As much as many coaches are accustomed to placing significant importance on batting averages, Beane considered otherwise. Instead, he chose to focus centrally on the number of runs scored per game. In this case, it is worth noting that in a given match, the task of the hitter is creating home runs, hitting doubles, getting on base, and stealing bases. Subsequently, the newly derived sabermetric, OPS, helps measure a hitter’s success with respect to the number of runs one would create throughout the match (Lewis, 2003). As a way of assessing the number of runs created by a hitter, Beane adopted the following formula in his tall task to success.
Runs Created = (Hits + Walks) × Total Bases / (At-bats + Walks)
Precisely, the formula has a success rate of about 90%, estimating a team’s actual runs scored to within 5% (Lewis, 2003). Therefore, James’ sabermetrics on OBP need to be meticulously examined, because walks are a critical part of the runs created.
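The runs-created formula above translates directly into a one-line function; a minimal sketch, with counting stats supplied as plain integers:

```python
def runs_created(hits, walks, total_bases, at_bats):
    """Bill James's basic runs-created estimate, as given in the formula above."""
    return (hits + walks) * total_bases / (at_bats + walks)
```

For example, a team with 1,500 hits, 500 walks, 2,200 total bases, and 5,500 at-bats is estimated at (1500 + 500) × 2200 / (5500 + 500) runs created.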
Discussing the Importance of Approaching Business
Problems in Innovative Ways
Motivating creativity and exploring innovative ways of
approaching a business has helped improve an organization’s productivity beyond
recognized borders. According to Triady & Utami (2015), innovation is a
critical component for an organization because it enhances its competitiveness.
Adopting an innovation implies that an organization will save time, cost and
other resources along its production line.
Most importantly, reassuring employees toward positive, out-of-the-box thinking ostensibly offers them a chance to commit time and resources effectively. Equally, proper utilization of employees’ capabilities is the first step toward inculcating innovation in business. For example, employees are empowered to explore new innovative means regarding the cost-effectiveness of several business solutions (Triady & Utami, 2015; Lewis, 2003). Therefore, creativity advances the much-needed process of solving problems to stay ahead of the competition. In brief, creative problem-solving brings about a better competitive edge that any business might adopt to realize success.
Lewis, M. (2003). Moneyball: The art of winning an unfair game. New York: W.W. Norton and Company.
Triady, S. M., & Utami, F. A. (2015). Analysis of decision-making process in Moneyball: The art of winning an unfair game. The
A logarithmic function is an equation of the form y = log_b(x), read as “y equals the log of x, base b.” In this form, x and b are greater than zero, while b is not equal to one (Makgakga and Sepang 78). On the other hand, an exponential function relates a variable to an exponent, such as 10 raised to the power x, giving equations of the form y = b^x (Makgakga and Sepang 78). Likewise, an exponential graph is well described by an exceptionally rapid growth rate, either in size or extent.
The association between the graphs of exponential and logarithmic functions is illustrated in the following ways. The graphs of the two functions are not the same; specifically, the logarithmic function is the inverse of the exponential function. This implies that a^x = b is an exponential equation, whereas log_a(b) = x is the corresponding logarithmic equation (Makgakga and Sepang 81). Their operations are inverses of one another in the sense that the logarithm can be regarded as isolating the variable of interest, and vice versa, when expressing an exponential equation.
Moreover, the inverse of a logarithmic function is an exponential function. The inverse of an equation is derived by switching the x and y coordinates, so that the graph reflects across the line y = x (HELM 4). For instance, the natural exponential function is given by y = f(x) = e^x, while the natural logarithmic function is expressed as f(x) = log_e(x) = ln(x) (HELM 4). In short, x is considered greater than zero. In summary, the logarithmic graph curve is a literal reflection of the exponential graph curve across the line y = x.
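The inverse relationship between the two functions can be checked numerically; a minimal sketch with an assumed base b = 2, where composing either function with the other returns the original input:

```python
import math

# Sketch: y = log_b(x) and y = b**x are inverse functions
# (b > 0, b != 1; the logarithm additionally requires x > 0).
b = 2.0

def exp_b(x):
    """The exponential function b**x."""
    return b ** x

def log_b(x):
    """The logarithm of x to base b."""
    return math.log(x, b)

# Composing one function with the other recovers the original input,
# i.e. the two graphs are reflections of each other across y = x.
assert math.isclose(log_b(exp_b(5.0)), 5.0)
assert math.isclose(exp_b(log_b(9.0)), 9.0)
```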
Makgakga, Sello, and Percy Sepang. “Teaching and Learning the Mathematical Exponential and Logarithmic Functions: A Transformation Approach.” Mediterranean Journal of Social Sciences, vol. 4, no. 13, 2013, pp. 77-85.
How the Incorporation of Green Bonds Impacts Economic Sustainability Globally
As much as green finance centrally focuses on how a specific
investment can be environmentally friendly, recent studies have stressed the
importance of deriving greater economic sustainability from green bonds. In
this line of thought, investing in green bonds would significantly impact
sustainability due to the following reasons. Green finance not only delivers
additional sustainability but also impacts the economy on a larger scale.
Therefore, the most recent approach to green financing seeks to facilitate the
transformation of the entire process of initiating sustainability by
integrating various sustainable projects. The combination of several green bond
projects would equally lessen the high costs attached to an unsustainable
system’s side effects. With the creation of green bond markets, there has been
assured mitigation of both finance and risk constraints. In this regard, the introduction of green financing through green bond markets has positively impacted sustainability by slowing down environmental degradation.
Green Bond Market
With the intensification of environmental degradation
globally, it is essential to invest in green bonds because it seeks to improve
sustainability (Alonso-Conde and Javier Rojo-Suárez 4). In particular, it is now
of high economic advantage to invest in green financial products like bonds.
Investments in green financing have increased across the globe, with many
nations channeling their efforts towards sustainability. For example, the move
towards economic sustainability has been earmarked by massive initiatives in
green bond markets, as illustrated in figure 1 (Alonso-Conde and Javier
Rojo-Suárez 4). The figure expounds on the trends in the primary green bond
market, ranging from 2012 to 2019. The trend has significantly improved up to
2019, with many issuers of bonds diversifying their projects in green financing
(See figure 1). Graph (a) displays the growth in both new issues and market
size in $ billion; it signifies the recent diversification of financial products
to receive green fixed incomes (Alonso-Conde and Javier Rojo-Suárez 4). On the
other hand, graph (b) portrays the geographical distribution of green bond
issuers worldwide, from Germany to the US.
Figure 1. The green bonds market and its universal geographical diversification (Alonso-Conde and Javier Rojo-Suárez 4).
Illustratively, the graph implies
an increasing appetite for green bonds among the corporate and government
issuers with a 60% growth in 2019 (Alonso-Conde and Javier Rojo-Suárez 4). More
so, graph (b) indicates that the Europeans are leading in the move towards
green financing, with an additional 32% of new issuers across the market. The
US green bond market follows with 20% of new issuers, and the Chinese market records new issuers at 7.5% (Alonso-Conde and Javier Rojo-Suárez 4). All this growth in the number of green bond issuers occurred in 2019. From these illustrated figures, it can be opined that
the green bond market has emerged as a well-established market among investors.
The green bond market has continued to grow exponentially since 2019 because it
receives immense support from the UN-SDG on climate awareness (Alonso-Conde and
Javier Rojo-Suárez 4). Although the market is still young and promising with
developing countries’ dominance, there has been a recent comparative embrace
among other nations like Brazil, Mexico, and China (Alonso-Conde and Javier
Rojo-Suárez 4). Therefore, the emergence of the green bonds market has fortunately bolstered cross-regional trade by pinpointing various international trading opportunities a country might harness.
However, this nascent green bond market necessitates increasing regulation from the relevant governments. The market requires a comprehensive and broad set of government controls since its growth is pegged on the degree of issuance, supervision, and liquidity (Alonso-Conde and Javier Rojo-Suárez 5). Considering
this kind of bond market as a subject of energy transition when allocating huge
finances, many European nations have increasingly tightened their regulatory
actions. In turn, the ever-growing regulation situation has compelled the need
for a public-private regulatory framework to identify and analyze the
challenging gaps in governance (Alonso-Conde and Javier Rojo-Suárez 5). In the
long run, the government is responsible for optimizing the wide-ranging
interests of both the investors and stakeholders. For example, the European
Commission published an Action Plan on Financing Sustainable Growth in 2018.
The Action Plan is designed to offer a comprehensive strategy for improving
the link between finance and sustainability (Alonso-Conde and Javier
Rojo-Suárez 5). Categorically, the action plan encourages the redirection of
capital flows towards sustainable growth, intending to attain inclusive growth.
Also, the action plan is aimed at handling the risks, which result from adverse
climate change, depletion of resources, and degradation of the environment.
More so, the said action plan has equally promoted transparency coupled with a
long-term visualization of both financial and economic activity (Alonso-Conde
and Javier Rojo-Suárez 5). Subsequently, a European Union Green Bond Standard
(EU-GBS) benchmark encourages the financing of low carbon projects.
Background on Green Bonds and Their Government Regulation
Literature on green bonds concentrates wholly on the close association between their pricing and the attached financial market (UN-ECLAC 2). In particular, existing literature establishes the relation of fixed-income together with currency markets. However, the authors draw a weak connection between green bonds and the value of stock and energy. This literature provides a systematic analysis of the derived returns, liquidity, and volatility of green bonds (UN-ECLAC 2). Besides, they offer discussions on third-party authorization’s primary responsibilities regarding both personal and institutional challenges. Third-party authorization is critical for private issuers because corporate green bonds are regarded as less favorable regarding volatility and liquidity (UN-ECLAC 2). Furthermore, corporate bonds might have higher interest rates compared with their conventional counterparts. Therefore, this empirical literature reviews the pricing of green bonds in the market. For example, green bonds have a higher comparative liquidity-adjusted yield premium than regular bonds (UN-ECLAC 5). With lower interest rates pegged on greater liquidity, green bonds attract higher daily interest spreads, leading to favorable market prices.
Subsequently, the authors have opined that both financial and corporate green bonds trade more often than non-green bonds (UN-ECLAC 5). Most importantly, the marginal sales of government-associated green bonds are more significant due to the size of the issue and maturity. However, the type of currency does not influence the pricing of green bonds. Instead, the Environmental, Social, and Governance (ESG) rating dictates the market prices of the green bond (UN-ECLAC 5). In this regard, this literature provides an inquiry into the dynamics of volatility coupled with spillovers attached to green bond markets. Thus, government regulation of the green bond market is a measure of controlling the bonds’ sensitivity with respect to volatility. As much as a considerably larger part of the literature discusses investors’ perspectives on the green bond market, discussions of the incentives offered on issuance are significant (UN-ECLAC 6). Primarily, governments provide financial incentives to issuers; these green bonds decrease the cost of invested capital through financing, leading to lower risks on the availability of capital (UN-ECLAC 7). All these measures are relevant to encourage the issuance of green bonds because they enhance the protection of the environment by creating value among investors.
Tracking the Additionality of Green Bonds Globally
The primary features of assessing the impact of the green
bond are based on the following two significant yardsticks. First, evaluating
the impact of such an example of a green fixed income depends partly on
establishing whether the finances attached to the bond flows towards a
verifiably green project (Tu, Sarker, and Rasoulinezhad 4). Second, assessing
the impacts of the bonds also relies on the determination of the extracted
sustainable value in respect to the green label project (Tu, Sarker, and
Rasoulinezhad 4). In this regard, it is essential to designate an appropriate
framework for channeling the required green financing towards specific
projects. Such a move would prevent diverting green resources into less “green” investment projects. The reason is that measuring and tracking the additionality of a green bond inside a less “green” project is difficult (UNEP 10). Thus, this test of additionality is underscored by the possibility of measuring whether more significant financial flows would result in higher sustainability. Furthermore, the incorporation of green financing stimulates high levels of sustainability by reducing the relative costs of financing in extra sustainable investments (UNEP 11). Thus, green bonds encourage high economic sustainability by lowering the relative cost of sustainability undertakings.
Sustainability as a Worldwide Transition through the Use of Green Bonds
Achieving green financing requires a clear comprehension of both the start and end points associated with the direction of sustainability. Such understanding is gained by having expeditious knowledge of the following questions concerning green bonds (Tu, Sarker, and Rasoulinezhad 5):
What a sustainable economy looks like and its prospective pathway to sustainability.
What barriers impede, or what prospective catalysts encourage, the sustainability vision.
How finances help overcome barriers or activate catalysts.
The measurement metrics for guiding finance along the attained green path.
The path and sustainability of a
green bond entirely rely on the circumstances surrounding the issuance of such
a bond. For example, the case study of the green bond market in China has
utilized Shanxi and Sichuan energy systems to illustrate the same pathway to
sustainability (Alonso-Conde and Rojo-Suárez 11). Apart from that, there might
be impeding barriers and catalysts, including technological, behavioral,
financial, or political, which negatively impact the market of green bonds. Hypothetically,
the introduction of green financing through government bonds would decrease the
technological cost like the R&D. Moreover, there are assured financial
incentives that catalyze the behavioral change of issuers in a bond market.
Consequently, the provision of government incentives helps reduce capital
costs, leading to the financing of projects that would otherwise be unfinanced
(Sartzetakis 2). Also, lower capital
costs would encourage the government to initiate policy changes on the
regulation of green bond markets. As much as there are many sustainable energy
investments financed using lower costs, the application of green bonds helps
avoid the financial risks associated with the shift to a low carbon system
(Sartzetakis 7). Thus, the shift to low carbon energy entails several expensive
infrastructural changes that impact the economy differently.
Implications of Green
Bonds on Economic Sustainability
Though the application of green bonds helps increase the number of financed projects, green bonds also help deliver added value, or additionality (Maltais and Nykvist 4). Whereas some pundits argue that the usability of green
bonds does not imply additional value, others have disagreed over the implied
economic benefits. In this regard, the consideration of green financing saves
on the initial costs of capital, thus ensuring that a vast number of projects
are entirely financed (Sartzetakis 13). Therefore, the legitimate concern that
the issuance of green bonds offers a false impression regarding its
sustainability variable is true based on the following analysis. From the
Swedish perspective, it is understood that the actors trade green bonds the
same way they would do to the conventional bonds (Nelson 6). They argue that
the green bonds function just like other financial instruments, and therefore
these bonds have similar environmental impacts on economic sustainability
(Nelson 6). Further, they have constantly opined that the falsely created impression about green bonds is the specific reason why many investors have pumped large capital bases into novel investments. Practically, there is strong
evidence that indicates most investors and issuers have changed their interaction activities in the capital markets (Maltais and Nykvist 8). Indeed, such alterations in the bond market have occasioned positive effects on many organizations’ operational sustainability. Furthermore, numerous similarities pinpoint the way the green bond market and active ownership function among external stakeholders (Maltais and Nykvist 8). Even though many bonds lack voting rights, all buyers of these green bonds have engaged in sustainability dialogues. Consequently, although the trading of green bonds signifies only a small percentage of the bond market, there is an effort among investors to expand the rising number of opportunities.
The introduction of green bonds is designed as a familiar
and low-risk financial instrument because its issuance significantly
contributes to economic sustainability at a comparatively low cost. Additionally,
the market requires a comprehensive and broad set of government control since
its growth is pegged on the degree of issuance, supervision, and liquidity. The
ever-growing regulation situation has compelled the need for a public-private
regulatory framework to identify and analyze the challenging gaps in
governance. In the long run, the government is responsible for optimizing the
wide-ranging interests of both the investors and stakeholders. Also, lower
capital costs would encourage the government to initiate policy changes on the
regulation of green bond markets. As much as there are many sustainable energy
investments financed using lower costs, the application of green bonds helps
avoid the financial risks associated with the shift to a low carbon system.
Though many pundits do not consider green bonds responsible for transitioning capital from unsustainable to sustainable investments, green bonds’ trading
offers incentives to issuers. In brief, the provided incentives allow the
issuers to finance any number of projects.
Alonso-Conde, Ana-Belén, and Javier Rojo-Suárez. “On the Effect of Green Bonds on the Profitability and Credit Quality of Project Financing.” MDPI – Sustainability, vol. 12, no. 6695, 2020, pp. 1-23.
Maltais, Aaron, and Bjorn Nykvist. “Understanding the Role of Green Bonds in Advancing Sustainability.” Journal of Sustainable Finance and Investment, 2020, pp. 1-20.
“Green Finance in China: Achieving Sustainability through Finance.” Climate Policy Initiative, 2020, pp.
Sartzetakis, Eftichios. “Green Bonds as an Instrument to Finance Low Carbon Transition.” Economic Change and Restructuring, 2020,
Tu, Chuc, Tapan Sarker, and Ehsan Rasoulinezhad. “Factors Influencing the Green Bond Market Expansion: Evidence from a Multi-Dimensional Analysis.” MDPI – Journal of Risk and Financial Management, 2020, pp. 1-14.
United Nations Economic Commission for Latin America and the Caribbean (ECLAC). “The Rise of Green Bonds.” Financing for Development in Latin America and the Caribbean, 2017, pp. 1-46.
UNEP. “Green Bonds: Country Experiences, Barriers, and Options.” UNEP Inquiry in Support of the G20 Green Finance Study Group, 2020,
The lowest bidders might quote lower bids because they have left out some vital items during the bid process. During the progress of the project, they might come on board to ask the contractor for more money to cater for the missed aspects. This implies that less money is spent when all items are purchased at once, whereas purchasing missed items midway might prove more costly; thus the contractor may end up spending more than the planned budget. Also, the lowest bidder might have quoted for materials that are cheap or of lower quality. Such an arrangement might endanger the safety of the work in the long run, leading to frequent accidents at construction sites. Likewise, the lowest bidder might not possess the adequate skills relevant for the contract, leading to substandard work output that inconveniences the contractor.
An electrical sub-contractor must be certified by the National Inspection Council for Electrical Installation Contracting (NICEIC). Such a body assesses both domestic and commercial electricians to enhance the safety of workmanship. The sub-contractor should be certified by the national body that checks installation standards for further affirmation in a bid to control the required quality. Besides, better customer service would impact positively on the reputation of the sub-contractor based on his or her previously accomplished works (Ramalingam 1). Therefore, a sub-contractor should possess service attributes like punctuality and reliability, work tidiness, and on-time task completion to improve customer service.
Overcharging a quotation during the bid is an ethical breach that I personally experienced during my first attachment at a certain company. Categorically, there were reports of corruption in an electrical company resulting from the sub-contractor’s falsification and overcharging of receipts. During their bid, the company quoted extremely high prices. The company’s engineers on-site forged the receipts of payment and thus charged clients higher prices than the actual ones. When this was discovered, the client authorized the main contractor to terminate the company from any further construction work.
Ramalingam, Shobha. “Subcontractor Selection Process through Vendor Bids: A Case of an Outsourcing Service in Construction.” IIM Kozhikode Society & Management Review,
Balanced Scorecard, and Strategic Profitability Analysis
Until recently, a company’s performance measures have been based on the evaluation of financial accounting. Nowadays, organizations incorporate qualitative and quantitative criteria, as well as short-term and long-term goals, when implementing performance evaluation for the company (Johansson & Carr, 2018). The most preferred approach to attain this is the balanced scorecard.
A balanced scorecard can evaluate employee performance on different quantitative factors by utilizing the existing qualitative and available financial information. Quantitative measures emphasize previous results, primarily as provided in the financial statements (Johansson & Carr, 2018; Kaplan & Norton, 1993). However, qualitative measures address the current outcomes of employee activities, evaluating them to help influence the company’s future financial performance (Carey & Knowles, 2020; Motacki & Burke, 2011).
As such, the subsequent discussion examines the balanced scorecard for Limpers Limited. Limpers Limited is a sales company operating in a competitive environment. Recently, the management team wanted to evaluate the quality of the different strategies undertaken by the firm. Hence, the subsequent sections discuss the generic strategies the company uses, what comprises reengineering, and the four perspectives of the balanced scorecard. The paper further aims to analyze the changes in operating income to evaluate strategy and identify unused capacity.
Generic Strategies the Company Is Using
Generic strategies for a company refer to the general approaches the company utilizes to position itself in the industry. In the case
of Limpers Limited, the company has been using various notable strategies. Some
of these are implementing cost-cutting measures to increase its price
competitiveness. Other methods include differentiation and best value
approaches. Although these strategies have helped the company remain effective,
additional procedures are needed to improve its operational efficiency.
Thus, these strategies are explained in the balanced scorecard.
What Comprises Reengineering
Reengineering encompasses the examination and redesigning of an organization’s business processes and related workflows. Notably, business processes refer to a set of work activities that employees perform to achieve their goals (Wang, National Research Council Canada, & International Conference on Flexible Automation and Intelligent Manufacturing, 2004; Martin, 2002). Hence, reengineering is performed to enhance flexibility, responsiveness, efficiency, and effectiveness in the company.
The Four Perspectives of Balanced Scorecard
Next is a general
description of perspectives and illustration of Limpers Limited’s measures to
achieve its strategy. These perspectives are discussed as follows.
Financial Perspective
This perspective aims to evaluate the profitability of
the strategy and the creation of the shareholder’s value. The main strategic
objective for Limpers Limited is to reduce costs concerning competitors and
increase its sales performance. Therefore, the financial perspective focuses on
evaluating the income attained from cost reduction and increased sales.
Customer Perspective
The perspective seeks to identify targeted customers.
It also identifies market segments besides understanding different measures undertaken
by the company within the respective components (Nejati & Nejati, 2009). Limpers
Limited uses market share and performance measures as ascertained from
communication networks to monitor its customers. Thus, this also helps to
establish new customers and satisfaction ratings for their customers.
Internal Business Process Perspective
This perspective focuses on internal operations that
target creating customer value, thus enhancing financial performance. According
to Limpers Limited, this perspective is determined through benchmarking with
its competitors based on published financial information, current market
prices, customer and supplier feedback, utilization of financial analysts and
experts from the industry (Carey & Knowles, 2020).
The perspective is explained in three primary processes that include
innovation, operational and post-sales processes. Through innovation, Limpers
Limited can create products and services and improve customer services by
invoking innovations in the market (Iqbal, 2019). Equally, the operational process ensures that the
company can produce and deliver customer-focused products. Finally,
post-sales–service processes enhance evaluation of aftersales performance, thus
improving existing activities.
Learning and Growth Perspective
The perspective seeks to identify the organizational capabilities that should be acquired to realize superior internal processes, thus enhancing the creation of customer and shareholder value (Iqbal, 2019). The learning and growth perspective for Limpers Limited is realized through information system capabilities, employee capability, and motivation capabilities.
Analysis of Changes in Operating Income
Table 1 below shows the changes in operating income used to evaluate strategy for Limpers Limited.
Table 1: Limpers Limited – Balanced Scorecard
[The table lists objectives, measures, and results for each perspective. Recoverable entries include the following.
Financial perspective – increase shareholder value; grow income from revenue growth and production gains; cost management and utilization of unused capacity ($2,800,000; $1,689,000).
Customer perspective – build a strong relationship with customers; improve customer satisfaction; measures include market share across communication platforms, new customers, and customer ratings; identify future customer needs and new customers (93% of customers gave high ratings; 85% gave high ratings).
Internal business process perspective – improve delivery dates and quality; reengineer the order-delivery process for service customers.
Learning and growth perspective – empower workers to manage the process; ensure manufacturing processes produce real-time feedback; train the team of supervisors; improve offline and online data collection.]
As indicated in Table 1, the firm surpassed its target performance on the financial perspective, moving from 7.1% to 8.1%, a 1% increase. Customer-base strategies, which included identifying future customer needs and new customers, also surpassed the target by 0.8% and 0.1%, respectively. However, the strategy to increase customer sales focus fell under the mark. The internal business process perspective strategies significantly surpassed the target performance. Equally, from the learning and growth perspective, the firm sought to empower 93% of its workforce and enhance understanding of the system by 84%. The second strategy fell below the target performance, with only 83% capability enhancement achieved in the information system.
Some unused capacity includes underutilized employee
skills and experience. The company can manage this through target allocation of
duties and responsibilities based on employee specialization, thus enhancing their performance.
Carey, M., & Knowles, C. (2020). Accounting: A smart approach.
Motacki, K., & Burke, K. (2011). Nursing delegation and management of patient care.
Nejati, M., & Nejati, M. (2009). Global business and management research: An International Journal, Vol. 1, No. 2. Universal-Publishers.
Wang, National Research Council Canada, & International Conference on Flexible Automation and Intelligent Manufacturing. (2004). Proceedings of the 14th International Conference on Flexible Automation and Intelligent Manufacturing. Vol. 2. Intelligent manufacturing. NRC Research Press.