

Leveraging IT 

for 

Business Advantage

(A Draft)

Table of Contents


Introduction

Literature review

Discussion

Research methodology

Conclusion

References



Introduction


Cloud computing is a term used to describe the delivery of hardware and software over a network (usually the Internet). Put simply, cloud computing is computing based on the Internet. In the past, people ran programs and applications from software downloaded onto a physical computer or server in their building (Metheny, 2013). Cloud computing allows people to access the same kinds of applications through the Internet. It rests on the premise that the main computation takes place on a machine, often a remote one, that is not the one currently being used (Singh, 2016). Data collected during this process is stored and processed by cloud servers. Users can securely access cloud services using credentials obtained from the cloud computing provider (Sarojni, 2016).


Cloud security refers to a range of control-based policies and technologies designed to comply with regulatory rules and to protect the data, applications, infrastructure, and information associated with the use of cloud computing (Sarojni, 2016). Because the cloud is by nature a shared store, privacy, access control, and identity management are of special concern (Metheny, 2013). With many companies relying on cloud computing and cloud providers for data operations, proper security in these operations and in several other sensitive areas has become a priority for companies contracting with cloud information technology providers (Sarojni, 2016). Cloud security processes must address the security controls that cloud providers will integrate to maintain customer privacy, comply with related regulations, and protect data (Metheny, 2013). As such, this article aims to discuss some key data security threats that cloud providers should mitigate to improve customer data security.

Cloud computing is now a global concept used by the majority of Internet users. The number of organizations, businesses, and individual users that rely on cloud-provided resources and store important information in the cloud has grown greatly over the years because of the simplicity and attractiveness the cloud offers (Singh, 2016).

Despite its widespread use, the cloud also presents obstacles when it comes to protecting stored data. There is currently major concern among cloud users about the security of the data they transmit to the cloud (Sarojni, 2016). Highly motivated and skilled hackers are trying their best to intercept or steal large amounts of data, including important information transmitted to or stored in the cloud. In response to these hacking motives, various scholars have proposed many techniques to protect data security during transmission (Metheny, 2013).

Data protection has always been an important issue in information technology. In the cloud environment it becomes critical, because data may sit in a single place yet be accessed from anywhere in the world. Although many cloud computing techniques have been studied in both academia and industry, data protection and confidentiality are becoming more important for the future development of cloud computing technology in government, companies, and commercial enterprises (Singh, 2016). Information security and privacy issues relate to both the hardware and the software in cloud architecture, and they are the two factors consumers care about most in cloud technology (Sarojni, 2016).



Cloud computing has emerged as a modern technology that has evolved over the past few years and is seen as the next big thing for years to come. Because it is new, it raises new security issues and faces new challenges. In recent years it has grown from a simple concept into an important part of the computer industry. Cloud computing has gained wide acceptance as virtualization, SOA, and utility computing have been adopted (Sarojni, 2016). Architectural security issues also evolve with the different architectural designs on which cloud computing operates.

Cloud computing is one of the most promising computing technologies for both cloud providers and cloud consumers (Metheny, 2013). Yet to make the best use of cloud computing there is a growing need to close the existing security gaps. The existence of data protection sanctions and regulations will certainly help increase cloud security, since cloud providers will be obliged to satisfy all of these regulations in their security practices and policies; but such regulations are not a prerequisite for data security in the cloud. A lack of data security will keep customers from moving to the cloud, so even where no data protection regulations exist, cloud providers still need to apply strong security policies (Singh, 2016).

The deployment model affects the level of security risk facing cloud customers: as the deployment moves from a private to a public model, the security gaps increase. These gaps include the usual hacking threats, web and browser vulnerabilities, and the risks that arise from sharing storage among multiple tenants (Sarojni, 2016). As the customer moves up the cloud computing stack toward the Software as a Service level, the likelihood of being hacked grows; each layer adds new security gaps and potential risks, so strong security policies should be applied by an experienced security team at the cloud provider.

Cloud data security is a major concern, and various methodologies have been proposed. Protecting data in cloud computing raises issues of privacy rights and of data storage, since, as the recent Wikileaks case showed, important information can be intercepted. Cloud computing works in layers, and applying policies on top of these layers provides a better approach to managing security issues (Metheny, 2013). Cloud computing has opened a new horizon for data storage and service deployment. Its most important feature is that it gives customers a new way to increase capacity and add functionality to their machines on the go. Before getting started with cloud computing, it is important to understand three concepts:


Cluster computing

Grid computing

Utility computing


In cluster computing, a cluster is a group of interconnected local computers that work together toward a single purpose. Grid computing, by contrast, connects large numbers of geographically distributed individual computers into a massive super-infrastructure (Singh, 2016). Utility computing works on a pay-as-you-go model: you pay only for what you access and use from a pool of shared resources (e.g., storage systems, software, and servers), much like utilities such as water, electricity, or gas. This document is intended to provide guidance on the benefits and types of cloud computing. It describes the process we followed in this study, followed by the benefits of cloud computing, cloud computing models, types of clouds, security problems and challenges in cloud computing, and the findings of our research (Singh, 2016).
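The pay-as-you-go idea behind utility computing can be sketched as a simple metered bill. This is only an illustration: the resource names and rates below are invented, not any provider's actual prices.

```python
from dataclasses import dataclass

@dataclass
class MeteredResource:
    name: str
    unit: str
    rate: float  # price per unit consumed (hypothetical)

def pay_as_you_go_bill(usage: dict[str, float],
                       catalog: dict[str, MeteredResource]) -> float:
    """Total charge: each resource is billed only for what was consumed."""
    return sum(catalog[name].rate * amount for name, amount in usage.items())

catalog = {
    "storage_gb": MeteredResource("Object storage", "GB-month", 0.02),
    "compute_hr": MeteredResource("VM instance", "hour", 0.10),
}
usage = {"storage_gb": 500, "compute_hr": 72}
print(pay_as_you_go_bill(usage, catalog))  # 500 GB + 72 h of compute, about $17.20
```

An unused resource simply never appears in `usage`, so it costs nothing, which is exactly the contrast with owning the hardware outright.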








Literature review


According to Goyal (2019), data security has become a significant threat to the cloud computing environment. While there are significant privacy and security techniques that can mitigate these threats, there are no good techniques to actively eliminate them. On the other hand, Cook et al. (2018) determined that a substantial degree of data encryption, access control, strong authentication, data segregation models, and avoidance of high-value data storage could essentially remove additional threats. However, recent cases of sophisticated data leaks show that, despite efforts to resolve them, incidents exploiting network vulnerabilities on cloud servers in the United States have increased to an even greater degree.


Jan de Muijnck-Hughes proposed a security technique called Predicate Based Encryption (PBE) in 2011. PBE belongs to the asymmetric encryption family and derives from Identity Based Encryption. The technique integrates attribute-based access control (ABAC) with asymmetric encryption, allowing an encryption/multi-decryption environment to be implemented with a single scheme. Predicate-based encryption centres its implementation on both Platform as a Service and Software as a Service. The proposed technique also prevents unwanted exposure, unsolicited leaks, and other unexpected breaches of the privacy of data residing in the cloud.
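In a real PBE scheme the predicate is enforced cryptographically inside the ciphertext itself; the sketch below only illustrates the attribute-based access-control idea behind it. All attribute names and the policy are invented for illustration.

```python
from typing import Callable

Attributes = dict[str, str]
Predicate = Callable[[Attributes], bool]

def make_policy(**required: str) -> Predicate:
    """Build a predicate that is satisfied only when every listed attribute matches."""
    return lambda attrs: all(attrs.get(k) == v for k, v in required.items())

def can_decrypt(attrs: Attributes, policy: Predicate) -> bool:
    # In actual predicate-based encryption this check happens inside the
    # decryption algorithm; here it is just an ordinary function call.
    return policy(attrs)

policy = make_policy(department="finance", clearance="high")
print(can_decrypt({"department": "finance", "clearance": "high"}, policy))  # True
print(can_decrypt({"department": "sales", "clearance": "high"}, policy))    # False
```

The point of the cryptographic version is that no server has to be trusted to run this check: a user whose attributes fail the predicate simply cannot decrypt.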


In 2013, Miao Zhou described five techniques to ensure data integrity and security in cloud computing: an innovative tree-based key management scheme, privacy-enhanced data outsourcing in the cloud, privacy-protected access control in the cloud, privacy-enhanced keyword search in the cloud, and public remote verification of the integrity of private data.

That work applied a keyword search mechanism to enable effective multi-user keyword search while hiding personal information in search queries. An encryption scheme for a two-tier system was presented to achieve flexible and granular access control in the cloud. The test results show that the proposed scheme is effective, especially when data files are large or integrity checks are frequent.
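Zhou's actual two-tier scheme is not reproduced here, but the basic idea of an integrity check on outsourced data can be sketched with an ordinary cryptographic hash: record a digest before upload, and re-verify it on retrieval.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest recorded before the data is uploaded to the cloud."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, expected_digest: str) -> bool:
    """Re-hash the retrieved data and compare against the stored digest."""
    return fingerprint(data) == expected_digest

original = b"quarterly-report-v1"
digest = fingerprint(original)                            # kept by the owner
print(verify_integrity(original, digest))                 # True: untouched
print(verify_integrity(b"quarterly-report-v2", digest))   # False: modified
```

Schemes like Zhou's public remote verification improve on this by letting a third party check integrity without downloading the whole file, but the trust model is the same: the digest (or its cryptographic equivalent) stays outside the cloud provider's control.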

In 2014, Aastha Mishra introduced an advanced secret-sharing key management system. The goal of that work is to provide a more reliable, decentralized, lightweight key management technique for cloud systems, giving more efficient key management and better data security. The security and privacy of user data are maintained by distributing key shares across multiple clouds through secret sharing, and by using knowledge-proof methods to verify the integrity of the shares. The technique also provides better protection against Byzantine failures, server collusion, and data manipulation attacks.
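Mishra's scheme builds on threshold secret sharing; the sketch below shows only the simpler n-of-n XOR variant, where every share is required to recover the key, so no single cloud provider ever holds enough information to reconstruct it.

```python
import secrets

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n: int) -> list[bytes]:
    """Split `secret` into n shares; all n are needed to reconstruct it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:                 # last share = secret XOR all random shares
        last = _xor(last, s)
    shares.append(last)
    return shares

def recover_secret(shares: list[bytes]) -> bytes:
    """XOR all shares back together to reconstruct the original secret."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = _xor(out, s)
    return out

key = secrets.token_bytes(16)        # e.g. a data-encryption key
shares = split_secret(key, 3)        # one share stored with each cloud provider
print(recover_secret(shares) == key) # True
```

Any subset smaller than n is statistically independent of the key, which is the property that defeats a single malicious or compromised provider; threshold (k-of-n) schemes such as Shamir's add availability on top of this by tolerating lost shares.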


The security and availability of cloud services depend mainly on the APIs related to data access and data encryption in the cloud. Additional research may be done to ensure the security of these APIs and network interfaces (Singh, 2016). New security proposals could address the challenges of protecting services from targeted and accidental attacks and from terms-of-service violations. In addition, layered APIs become more complex when third-party operators use cloud services, and beneficiary owners may be unable to access the service. Additionally, malicious insiders are a common threat to cloud services because they violate the terms of service and gain access to information they have no permission for. Typically a malicious insider is an employee who steals confidential information belonging to the company or its legitimate users. Malicious internal users can corrupt information, especially in peer-to-peer file-sharing systems.
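One standard building block for securing such APIs is request signing, so the server can verify both the caller's identity and the request's integrity. The sketch below shows a generic HMAC pattern, not any particular provider's actual protocol; the key, paths, and payloads are hypothetical.

```python
import hashlib
import hmac

def sign_request(secret_key: bytes, method: str, path: str, body: bytes) -> str:
    """Sign a cloud API request with a shared secret (generic HMAC pattern)."""
    message = method.encode() + b"\n" + path.encode() + b"\n" + body
    return hmac.new(secret_key, message, hashlib.sha256).hexdigest()

def verify_request(secret_key: bytes, method: str, path: str,
                   body: bytes, signature: str) -> bool:
    expected = sign_request(secret_key, method, path, body)
    return hmac.compare_digest(expected, signature)  # constant-time comparison

key = b"shared-api-secret"
sig = sign_request(key, "PUT", "/buckets/reports", b"payload")
print(verify_request(key, "PUT", "/buckets/reports", b"payload", sig))   # True
print(verify_request(key, "PUT", "/buckets/reports", b"tampered", sig))  # False
```

Because the signature covers the method, path, and body, an attacker who intercepts a request cannot alter it without invalidating the signature; real cloud APIs extend this with timestamps and scoped keys to block replay.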

Discussion


The benefits of cloud computing include the following:

High Speed – Quick Deployment

Automatic Software Updates and Integration

Efficiency and Cost Reduction

Data Security

Scalability

Collaboration

Unlimited Storage Capacity

Back-up and Restore Data

Disaster Recovery

Mobility

Data Loss Prevention

Control 

Competitive Edge



While there are many benefits to adopting cloud computing, there are also significant barriers to adoption (Singh, 2016). One of the biggest barriers is security, followed by compliance, privacy, and legal issues. Since cloud computing represents a relatively new computing paradigm, there is great uncertainty about how to achieve security at all levels (e.g., network, server, application, and data) and about how application security moves to the cloud. This uncertainty has led CIOs to state repeatedly that security is their top concern with cloud computing (Sarojni, 2016).


Security issues relate to areas of risk such as external data storage, dependence on the "public" Internet, lack of control, multi-tenancy, and integration with internal security (Metheny, 2013). Compared with traditional technologies, the cloud has many peculiarities, such as its large scale and the fact that the resources owned by cloud providers are completely distributed, heterogeneous, and virtualized. Traditional security mechanisms such as identity, authentication, and authorization are no longer sufficient for clouds in their current form (Singh, 2016). Security controls in the cloud are, for the most part, the same as security controls in any computing environment. However, because of the service models, operating models, and technologies used to enable cloud services, cloud computing can present risks to the organization that differ from those of traditional IT solutions. Unfortunately, building security into these solutions is often seen as making them more rigid.


Migrating mission-critical applications and sensitive data to public cloud environments, beyond the network of data centers under their control, is a major concern for companies. To alleviate these concerns, cloud solution providers must ensure that customers keep the same security and privacy controls over their applications and services, provide evidence to clients that their organization is secure, show that they can meet their service-level agreements, and demonstrate compliance to auditors (Sarojni, 2016).


This section presents a classification of security issues for cloud computing based on the so-called SPI model (SaaS, PaaS, and IaaS), identifying the main vulnerabilities of this type of system and the most important threats found in the literature on cloud computing and its environment. A threat is a potential attack that could lead to the misuse of information or resources, while the term vulnerability refers to weaknesses in a system that allow a successful attack (Singh, 2016). Some surveys focus on a single service model, or list general cloud security issues without distinguishing between vulnerabilities and threats. Here, we present a list of vulnerabilities and threats and show which cloud service models may be affected by each (Metheny, 2013). In addition, we describe the relationships between these vulnerabilities and threats, how the vulnerabilities can be exploited to perform an attack, and some countermeasures that attempt to solve or mitigate the identified problems.



Cloud Computing Platforms


Cloud computing has many web services platforms, such as:

Amazon Web Services

Google Cloud

Metapod

Microsoft Azure

Cisco          

etc.

Two of them are explained below:

Introduction

AWS stands for Amazon Web Services, a subsidiary of Amazon. Companies, individuals, and governments use it as a platform and API for cloud computing. These cloud computing web services include "abstract technical infrastructure and distributed computing tools and blocks" (aws, 2022). Amazon Web Services are used all over the globe through server farms. Subscribers pay fees according to their usage, hardware, software, operating system, and other features and services. AWS has been very beneficial for conducting business, and it is a preferred option for many new and existing companies and small enterprises.

The Google Cloud Platform, meanwhile, was established on October 6, 2011, and has become one of the most successful cloud computing services. It is a medium through which people can easily access cloud systems and other computing services. Google Cloud Platform provides services such as networking, computing, storage, big data, machine learning, and management. The platform consists of several physical assets, such as computers and hard disks, as well as virtual resources. Its benefits include better pricing and deals, improved service and performance, the ability to work from anywhere, efficient updates, and versatile security methods.


Models and Services:

AWS:

Amazon Web Services includes over 200 services, covering application services, analytics, machine learning, networking, developer tools, storage, databases, mobile, management, Internet tools, computing, and RobOps (aws, 2020). The most important offerings include Amazon Connect, AWS Lambda, Simple Storage Service, and Amazon Elastic Compute Cloud (aws, 2020). Amazon Elastic Compute Cloud provides a virtual cluster of computers that is available over the Internet at all times. Most of the services are not used by end users directly; instead, their functionality is offered through APIs for developers to use in applications.

Google Cloud:

Google Cloud Platform provides its services according to several models. The three standard models, per NIST, are:

Infrastructure as a Service (IaaS)

Platform as a Service (PaaS), and

Software as a Service (SaaS)

The above three models provide the following services:

Infrastructure as a Service (IaaS)

Applications (Managed by Customer)

Data (Managed by Customer)

Runtime (Managed by Customer)

Middleware (Managed by Customer)

O/S (Managed by Customer)

Virtualization (Managed by Provider)

Networking (Managed by Provider)

Storage (Managed by Provider)

Server (Managed by Provider)


Platform as a Service (PaaS)


Applications (Managed by Customer)

Data (Managed by Customer)

Runtime (Managed by Provider)

Middleware (Managed by Provider)

O/S (Managed by Provider)

Virtualization (Managed by Provider)

Networking (Managed by Provider)

Storage (Managed by Provider)

Server (Managed by Provider)


Software as a Service (SaaS)


Applications (Managed by Provider)

Data (Managed by Provider)

Runtime (Managed by Provider)

Middleware (Managed by Provider)

O/S (Managed by Provider)

Virtualization (Managed by Provider)

Networking (Managed by Provider)

Storage (Managed by Provider)

Server (Managed by Provider)
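The responsibility split in the three lists above can be condensed into a small lookup table. The sketch below encodes exactly that split; layer names are normalized to lowercase identifiers for the lookup.

```python
# Stack layers in order from most application-facing to most physical.
LAYERS = ["applications", "data", "runtime", "middleware", "os",
          "virtualization", "networking", "storage", "server"]

# Index of the first layer the provider manages, per service model:
# IaaS hands off at virtualization, PaaS at runtime, SaaS manages everything.
PROVIDER_FROM = {
    "IaaS": LAYERS.index("virtualization"),
    "PaaS": LAYERS.index("runtime"),
    "SaaS": 0,
}

def managed_by(model: str, layer: str) -> str:
    """Return who manages a given layer under a given service model."""
    return "provider" if LAYERS.index(layer) >= PROVIDER_FROM[model] else "customer"

print(managed_by("IaaS", "os"))            # customer
print(managed_by("PaaS", "os"))            # provider
print(managed_by("SaaS", "applications"))  # provider
```

Encoding the split this way makes the pattern obvious: moving from IaaS to SaaS simply slides the customer/provider boundary up the stack.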


Resource Virtualization:

AWS:

Amazon Elastic Compute Cloud (EC2) instances are the central part of Amazon Web Services and the core of its cloud computing platform. The hardware of the host computer depends on the instance type in use (aws, 2020). Amazon Web Services uses two distinct types of virtualization to support Amazon Elastic Compute Cloud instances:

PV

PV stands for paravirtualization. Paravirtualization supports only Linux guests. It can perform better than full virtualization because the guest kernel can communicate with the hypervisor directly.

HVM

HVM stands for hardware-assisted virtual machines. It is also known as full virtualization.

 Google Cloud:

Google Cloud Platform supports guest images running Linux and Microsoft Windows, and uses KVM as the hypervisor to launch virtual machines based on the 64-bit x86 architecture.


Scaling and capacity planning:

AWS:

Costs can be reduced by paying only for what one needs. This is achieved through capacity planning, in which data is collected and analyzed and advanced analysis is performed (aws, 2020). Similarly, scaling plans provide scaling policies and create a plan for resource utilization. Custom strategies can be created, and strategies can be separated for different types of resources. Scaling strategies and capacity planning are very useful because they reduce cost and improve resource utilization.
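One common scaling policy is a target-tracking rule: size the fleet so that average utilization approaches a target. The sketch below is a generic version of that idea, not AWS's exact algorithm; the target and bounds are invented defaults.

```python
import math

def desired_instances(current: int, avg_cpu: float, target_cpu: float = 60.0,
                      min_n: int = 1, max_n: int = 20) -> int:
    """Pick a fleet size that brings average CPU utilization toward the target."""
    wanted = math.ceil(current * avg_cpu / target_cpu)
    return max(min_n, min(max_n, wanted))  # clamp to the plan's capacity bounds

print(desired_instances(current=4, avg_cpu=90.0))  # scale out under load -> 6
print(desired_instances(current=4, avg_cpu=15.0))  # scale in when idle   -> 1
```

Because the rule is proportional (total load is roughly `current * avg_cpu`), it converges toward the target in one or two adjustments instead of oscillating the way naive "add one / remove one" rules can.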

Google cloud:

Google Cloud has likewise adopted many strategies so that its users can save cost and resources.


Load Balancing:

AWS:

A load balancer is used to "accept incoming traffic from clients and routes requests to its registered targets" (aws, 2020). It has many benefits: it divides the workload among different resources, and compute resources can be added or removed through load balancing in Amazon Web Services according to user demand, without affecting the overall operation of the application. Different interfaces on Amazon Web Services are used to create, analyze, and manage load balancers, including the AWS Management Console, the AWS SDKs, the AWS Command Line Interface (AWS CLI), and the Query API (aws, 2022).
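The routing behaviour described above can be illustrated with the simplest strategy, round-robin, which cycles requests across registered targets. This is a minimal sketch; the target addresses and request names are invented, and real load balancers add health checks and weighting on top.

```python
import itertools

class RoundRobinBalancer:
    """Cycle incoming requests across registered targets, one at a time."""

    def __init__(self, targets: list[str]):
        self._cycle = itertools.cycle(targets)

    def route(self, request: str) -> tuple[str, str]:
        """Pick the next target in the rotation for this request."""
        target = next(self._cycle)
        return target, request

lb = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
print([lb.route(f"req-{i}")[0] for i in range(4)])
# ['10.0.0.1', '10.0.0.2', '10.0.0.3', '10.0.0.1']
```

Adding or removing a target only changes the rotation, not the callers, which is why fleets behind a load balancer can be resized without interrupting the application.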

Google Cloud:

Load balancers are managed offerings on GCP that distribute traffic across multiple instances of your application. GCP takes on the operational overhead of managing them and decreases the risk of a non-functional, slow, or overburdened application.

Load balancing in Google Cloud Platform has three categories:

Global external HTTP(S) load balancer

Regional external HTTP(S) load balancer

Internal HTTP(S) load balancer


Security:

AWS:

Amazon Web Services claims to be the most secure platform for cloud computing. It gives users an environment in which they can run their businesses and companies with full security. Amazon Web Services data centers and networks are built to secure user-related information such as identities, devices, and applications. Users do not need to worry about privacy and security, which lets them focus on their business and gives them room to grow and innovate (aws, 2022).

Google Cloud:

The Google security model is built on more than 15 years of experience focused on keeping customers safe on Google products. Google Cloud Platform lets your applications and data operate on the same trusted security model that Google built for its own network.

Database Technology (open source and licensed):

AWS:

Amazon Web Services is a cloud computing platform that supports both open-source and licensed software. Users and customers can create and run open-source software on AWS, and AWS states that open-source software benefits everyone, which is why the company supports it. Some of the AWS open-source projects include Babelfish for PostgreSQL, EKS Distro, Bottlerocket, Firecracker, FreeRTOS, and the AWS Cloud Development Kit (aws, 2021). There are two licensing models, "License Included" and "Bring Your Own License" (BYOL). This is very beneficial, as the offerings are fully managed, highly available, fast, easy to migrate to, flexible in licensing, and reliable (aws, 2020).

Google Cloud:

The databases that Google Cloud Platform uses include:

Cloud SQL

Cloud Spanner

Bare Metal Solution for Oracle

BigQuery

Cloud Bigtable

Firestore

Firebase Realtime Database

Memorystore

MongoDB Atlas

Google Cloud Partner Services


Privacy Compliance:

AWS:

Amazon Web Services does not compromise on the privacy of its customers and users; strong privacy protection is one of the major advantages of using AWS, and it has earned users' trust. AWS is transparent in its privacy commitments: users control, create, and manage their data themselves. AWS's customer commitments are transparent, and the company has raised its standards for data privacy. Users can easily understand the contracts AWS provides on data privacy, as the language used is very simple.

Google Cloud:

Google Cloud's privacy protections likewise encourage users to adopt the platform.


Content Delivery:

AWS:

The content delivery network (CDN) is an important part of Amazon Web Services. It improves content delivery by replicating commonly requested content, which reduces the load on the application's origin and helps improve performance and scaling. It creates and manages connections for requesters and keeps them secure. Customers now commonly use content delivery networks for interactivity and content delivery; one example is Amazon CloudFront, which provides secure and reliable application delivery.

Google Cloud:

Cloud CDN stores content locally at certain edge locations and works with the HTTP(S) load-balancing service to deliver that content to users. It is important to remember that not all data will be stored in the CDN.

Cloud Management:


AWS:

Cloud management helps AWS users enhance their cloud strategy by providing solutions for managing operations and governance. AWS cloud management tool categories include cloud governance and resource and cost optimization (aws, 2020).

Google Cloud:

The following tools are available to manage the Google Cloud Platform:

Cloud Endpoints

Cloud Console Mobile App

Cost Management

Intelligent Management

Carbon Footprint

Google Cloud Market Place

Google Cloud Console

Service catalog

Cloud Shell

Cloud APIs

Config Connector

Terraform on Google Cloud


Price/ cost comparison of key services:

AWS:

Many customers find Amazon Web Services a bit expensive for personal use, but over the long term, after accounting for maintenance, electricity, and similar costs, the price is not that high.

Google Cloud:

Google Cloud, meanwhile, tends to cost less than AWS, and it includes role-based support and premium support.




Research methodology


In this study, emphasis is placed on secondary quantitative analysis of a relevant dataset, the "Cybersecurity Breach" dataset extracted from Kaggle, whose rows contain the legal entity name, state, company, location, summary, and year of the breach. The dataset includes 1,055 valid records, which is very helpful in identifying critical aspects of network security vulnerabilities (Kaggle.com, 2022). The analysis will be performed with the help of SPSS to understand the confidence, validity, and percentages of the data breaches.
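The percentage analysis described can be illustrated in a few lines of code. The rows below are invented placeholders mimicking the dataset's fields, not actual Kaggle data, and the computation is a plain frequency/percentage tabulation of the kind SPSS would produce.

```python
from collections import Counter

# Hypothetical rows mimicking the "Cybersecurity Breach" dataset fields used here.
breaches = [
    {"entity": "Clinic A", "state": "TX", "type": "Hacking/IT Incident"},
    {"entity": "Insurer B", "state": "CA", "type": "Theft"},
    {"entity": "Hospital C", "state": "TX", "type": "Hacking/IT Incident"},
    {"entity": "Lab D", "state": "NY", "type": "Unauthorized Access"},
]

def breach_percentages(rows: list[dict], field: str) -> dict[str, float]:
    """Share of records per category of `field`, as percentages."""
    counts = Counter(r[field] for r in rows)
    total = len(rows)
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

print(breach_percentages(breaches, "type"))
# {'Hacking/IT Incident': 50.0, 'Theft': 25.0, 'Unauthorized Access': 25.0}
```

Run against the full 1,055-record dataset, the same tabulation by year or state would surface the trends the study sets out to examine.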



Conclusion


Cloud computing is a relatively new concept that has many advantages for users; however, it also has some security issues that can slow its adoption. Understanding the gaps that exist in cloud computing will help organizations make the transition to the cloud. Because cloud computing leverages many technologies, it also inherits their security issues. Traditional web applications, data storage, and virtualization have come under scrutiny, but some of the proposed solutions are incomplete or non-existent. We have covered the security issues for the cloud models IaaS, PaaS, and SaaS, which vary by model. As described in this article, storage, virtualization, and networking are the key security concerns of cloud computing. Virtualization, which allows multiple users to share a single physical server, is one of the major concerns of cloud users. Another challenge is that there are many different types of virtualization technologies, and each may address security mechanisms in different ways. Virtual networks are also the target of certain attacks, especially when communicating with remote virtual machines.


In short, cloud computing shows significant potential in delivering easy-to-manage, cost-effective, powerful, and flexible resources over the Internet. These characteristics encourage individual users and organizations to migrate their services and applications to the cloud. However, services offered by third-party cloud providers present an additional security risk and pose a significant threat to user privacy. Therefore, organizations and cloud service providers should take measures to avoid such threats. Cloud computing has a bright future: it is scalable and saves companies a great deal of money on infrastructure, which has made it a lucrative option for businesses. This leads many companies to allocate budgets for the transition to the cloud. Despite all the security issues, companies continue to deploy their applications in the cloud.

References

Metheny, M. (2013). The FedRAMP Cloud Computing Security Requirements. Federal Cloud Computing, 8(12), 241-327. doi:10.1016/b978-1-59-749737-4.00009-5 https://www.sciencedirect.com/book/9781597497374/federal-cloud-computing

Metheny, M. (2017). Security and privacy in public cloud computing. Federal Cloud Computing, 7(8), 79-115. doi:10.1016/b978-0-12-809710-6.00004-4. https://www.sciencedirect.com/book/9780128097106/federal-cloud-computing?utm_content=infosec_PR&utm_medium=pressrls&utm_source=publicity

Singh, B., & K.S., J. (2016). Security Management in Mobile Cloud Computing. IEEE Xplore. https://ieeexplore.ieee.org/abstract/document/7566315

Sudhansu, R. L., et al. (2014). Enhancing Data Security in Cloud Computing Using RSA Encryption and MD5 Algorithm. International Journal of Computer Science Trends and Technology (IJCST), 2(3). https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.677.7852&rep=rep1&type=pdf

Aastha Mishra (2014) Data Security in Cloud Computing Based on Advanced Secret Sharing Key Management System, 20 Jan, 2019 [Online]. https://www.ethesis.nitrkl.ac.in/5845/1/212CS2110.pdf

Nesrine Kaaniche (2014) Cloud Data Security based on Cryptographic Mechanisms, 26 Jan, 2019 [Online]. https://www.tel.archives-ouvertes.fr/tel-01146029/document

Afnan U.K. (2014) Data Confidentiality and Risk Management in Cloud Computing 2 Feb, 2019 [Online]. https://www.ethesis.whiterose.ac.uk/13677/1/Thesis_Final_Afnan _27072016_ EngD.pdf

Sarojini, G., et al. (2016). Trusted and Reputed Services Using Enhanced Mutual Trusted and Reputed Access Control Algorithm in Cloud. 2nd International Conference on Intelligent Computing, Communication & Convergence (ICCC-2016). https://www.researchgate.net/publication/306068888_Trusted_and_Reputed_Services_Using_Enhanced_Mutual_Trusted_and_Reputed_Access_Control_Algorithm_in_Cloud


Bandaru, A. (2010), Amazon Web Services, Research Methods and Professional issues, https://www.researchgate.net/publication/347442916_AMAZON_WEB_SERVICES

ScienceDirect. (2016), Amazon Web Services, https://www.sciencedirect.com/topics/engineering/amazon-web-services

Mirghani, S. (2017), Comparison between Amazon S3 and Google Cloud Drive, https://dl.acm.org/doi/abs/10.1145/3158233.3159371

AWS. AWS Management Tools Competency Categories. https://aws.amazon.com/products/management-tools/partner-solutions/?partner-solutions-cards.sort-by=item.additionalFields.partnerNameLower&partner-solutions-cards.sort-order=asc&awsf.partner-solutions-filter-partner-type=*all&awsf.Filter%20Name%3A%20partner-solutions-filter-partner-use-case=*all&awsf.partner-solutions-filter-partner-location=*all

AWS, A Content delivery network, https://aws.amazon.com/caching/cdn/

Amazon Web Services, https://aws.amazon.com/compliance/data-privacy/

Amazon Web Services, open source at aws, https://aws.amazon.com/opensource/?blog-posts-content-open-source.sort-by=item.additionalFields.createdDate&blog-posts-content-open-source.sort-order=desc

Amazon Web Services, https://aws.amazon.com/rds/oracle/

Amazon Web Services, AWS cloud security, https://aws.amazon.com/security/

Amazon Web Services, elastic load balancing, https://docs.aws.amazon.com/elasticloadbalancing/latest/userguide/what-is-load-balancing.html

Hohenbrink, G. (2020). GCP 101: An Introduction to Google Cloud Platform. Onix. https://www.onixnet.com/insights/gcp-101-an-introduction-to-google-cloud-platform

Keshari, K. (2021). What is Google Cloud Platform (GCP)? – Introduction to GCP Services & GCP Account. Edureka. https://www.edureka.co/blog/what-is-google-cloud-platform/

Saran, G. Introduction To Google Cloud Platform. WhizLab. https://www.whizlabs.com/blog/google-cloud-platform/

Admin Globaldots (2018). 3 Key Cloud Computing Benefits for Your Business. Globaldots. https://www.globaldots.com/resources/blog/cloud-computing-benefits-7-key-advantages-for-your-business/

Google Cloud management tools. Google Cloud. https://cloud.google.com/products/management

Google Cloud Platform – Google Cloud CDN. Devopspoints. https://devopspoints.com/google-cloud-platform-google-cloud-cdn.html

We are IOD (2020). Load Balancing on Google Cloud Platform (GCP): Why and How. Level Up. https://levelup.gitconnected.com/load-balancing-on-google-cloud-platform-gcp-why-and-how-a8841d9b70c

Google Cloud Databases. Google Cloud. 


Cryptocurrencies

Cryptocurrencies have been making headlines for the past few years, and there is no doubt that they are here to stay.

But what exactly are cryptocurrencies, and how do they work?

In this article, we will explore the basics of cryptocurrency and attempt to answer some of the most frequently asked questions about this revolutionary technology. 

What is a Cryptocurrency?

A cryptocurrency is a type of digital currency that uses cryptography to secure its transactions and to control the creation of new units of currency.



 Cryptocurrencies are decentralized. This means that unlike traditional fiat currencies (the U.S. Dollar for example) there is no central authority that controls them — no government, no central bank. Rather, cryptocurrencies rely on decentralization and consensus to function properly.


How do Cryptocurrencies Work?

Modeled on Bitcoin’s pioneering technology, all major cryptocurrencies use blockchain to manage their transactions. A blockchain stores data in “blocks” which are then chained together chronologically, hence the name blockchain.

This data can be anything, but most often it includes a time stamp and details about a financial transaction: the sender’s public key (encryption code), the receiver’s public key, and how many coins were sent (transaction packet).

Each block contains a cryptographic hash of the previous block, and its own hash is computed over its contents; in this way, cryptocurrency algorithms constantly extend this publicly visible chain of data with new transactions.
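The chaining described above can be sketched in a few lines of Python. This is a minimal illustration, not any real cryptocurrency's implementation: the field names, SHA-256, and JSON serialization are all assumptions made for the example.

```python
import hashlib
import json
import time

def make_block(prev_hash, sender_key, receiver_key, amount):
    """Build a block whose hash covers its contents, including the previous block's hash."""
    block = {
        "timestamp": time.time(),
        "sender": sender_key,        # sender's public key
        "receiver": receiver_key,    # receiver's public key
        "amount": amount,            # how many coins were sent
        "prev_hash": prev_hash,      # link back to the previous block
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

# Chain two transactions: each block commits to the one before it.
genesis = make_block("0" * 64, "alice_pub", "bob_pub", 5)
second = make_block(genesis["hash"], "bob_pub", "carol_pub", 2)

# Tampering with an earlier block would change its hash and break every later link.
assert second["prev_hash"] == genesis["hash"]
```

Because each block's hash depends on the previous block's hash, altering any historical transaction invalidates every block that follows it.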

What is a Cryptocurrency Exchange?

To buy (or sell) any cryptocurrency, you must first set up an account with a digital exchange. These exchanges allow you to set up and maintain a digital “wallet” in which to hold your coins. Sources have compared the euphoria around these virtual currencies to the way people threw money at dot-com companies during the late 1990s tech boom, even though many of those businesses never made a profit, or only did so long after their share prices had collapsed.

Cryptocurrency is a digital or virtual asset designed to work as a medium of exchange. It uses cryptography to secure and speed its transactions all over the world and to control the creation of new units.

Cryptocurrency is a decentralized system, which means it is not subject to government or financial institution control. Bitcoin, the first and most popular cryptocurrency, was created in 2009.

Cryptocurrencies are often traded on decentralized exchanges and can also be used to purchase goods and services. Bitcoin, for example, can be used to book hotels around the world on Expedia, shop for furniture and other goods on Overstock, and buy Xbox games.

However, it was later revealed that the fees were hidden within the exchange’s referral program, meaning Virtue Poker pockets a 5 percent rake from each pot.

The research and intelligence report on Cryptocurrency, a “blockchain-driven digital payment gateway that carries both cryptocurrency and investment”, enables stakeholders to evaluate the short and long-term tangible benefits of this rapidly evolving technology. 

The findings in this timely report give insights into the blockchain technology behind cryptocurrency, but also highlight key areas where innovation is desperately needed. Businesses will see their taxation hassles across various jurisdictions greatly reduced, because every coin will carry a complete transaction history for regulatory compliance. With simple internet connectivity, revenue departments around the world can collect activity records in real time instead of relying on somewhat untrustworthy paper documentation behind complex crypto accounting software, leaving minimal discrepancies in tax revenue collection.

Some economic analysts argue that encouraging blockchain adoption would shift national regulations away from countries with strong gold standards that prefer fiscal consolidation, particularly among emerging nations (e.g., Turkey), toward those willing to adopt a form of monetary policy that includes competitive devaluation. This holds only if both centralized regulation and approval, and the decentralized revolution, create incentives for crypto adoption by businesses and consumers alike.

Simplified Fundraising Efforts in Businesses Combined With Innovative Decentralized Opportunities:

Get paid in dollars, spend it in dimes: if you are hoping to profit from digital money without managing the troublesome process of mining, or buying assets from an exchange like Coinbase, then growing a worldwide online business improves your odds of profiting from more than 700 virtual coins.

For instance, rather than buying gear directly on Amazon and selling it through a Shopify online store (e.g., Shoesers, which sold Bitcoin-themed children’s shoes), seek out organizations across the planet that provide items or services relevant to Amazon e-commerce activities while promoting various local currencies; acquire items via frugal channels; distribute goods via bitcoin; and so on.

Of course, you may have complications getting payments or access to cash after a series of transactions, but these can easily be handled if either the central bank (as in Australia) treats such activities as regular speculative investments, or e-commerce platforms invest in the whole gamut of blockchain technologies to manage their processes and absorb international fluctuations.

Use personal online resources to manage risks: if you are working on an independent website, consider using resources like a personal VPN and Bitcoin checkout services to ensure privacy for both you and your markets and clients. If you routinely deal with centralized exchanges that are vulnerable to global events, employ similar privacy tools for resource management in parallel structures (e.g., multi-currency wallets), with live bitcoin exchange price ticker data on dashboards.

As should be obvious from the above, one influential tip is not to assume any single digital asset is completely safe. BTC exchange rate fluctuation indicators can be used alongside multiple decentralized data interfaces, cryptocurrency market capitulation markers, and similar signals, mostly available via public and semi-public institutions’ data modules. Individual companies may often hold unique statistics that they cannot simply disclose, because the legal relationships involved may imply tax consequences for virtual assets.




                         Zane’s Cycle




Reliability

Reliability can only be measured in quantitative terms. For measuring reliability, probability and time must be taken into account: the likelihood of a product failing during a specific period, together with the time span for which it provides its rated performance, determines whether or not it is reliable. Buying a new bike can be intimidating, especially when so many people share your feelings and there are so many color options and component groups to choose from. We do everything we can to get you on the bike of your dreams, but you never know if the bike you bought is the proper fit for you until you ride it. Ride the bicycle for 30 days to ensure that you bought the right one. If you are not totally pleased during that time, simply return the bicycle for a refund; we'll happily give you full credit toward your new purchase.



Responsiveness 

Responsiveness is defined as the ability of a business to recognize and effectively adjust to changes in its industry and in its consumers' preferences. Companies that can effectively adapt to change are better able to manage disruption and continuously exceed the expectations of their consumers. Every rider who walks through our door should anticipate the best guided, rider-specific shopping experience available in any retail setting. Our expert staff will listen to your demands and guide you to the ideal product. Once your ideal bike has been chosen, we can customize it to make it even more unique to the rider, followed by a final inspection by one of our Certified Repair Technicians. Once we've loaded your new Zane's bike into your vehicle, you can travel with confidence, knowing that it is completely covered by our unparalleled guarantees, which are included with every transaction.



Assurance

In the corporate world, assurance has two connotations. It is a type of insurance that provides a reward in the event that a covered occurrence takes place in the future. Assurance also refers to the guarantee offered by auditing professionals regarding the authenticity and accuracy of documents and information that have been inspected. Zane's Cycles offers an exclusive "Zane's Cycles Lifetime Free Service and Parts Warranty" with every bicycle purchase. We'll make any necessary adjustments to your bicycle for free as long as you own it, whether it requires a service, a full tune-up, or just a quick adjustment. At Zane's Cycles, we promise that you will never overspend. If you find any in-stock item you bought selling for less in Connecticut within 90 days, we'll gladly refund you the difference plus 10%. Our Price Protection program assures you not only the finest guarantees for your bike purchase, but also the peace of mind that you got the best deal.

Empathy

Empathy is a key component of social interactions in the workplace and the marketplace. It is defined as the ability to comprehend and share customers' feelings, as well as to exhibit that understanding. Zane's is always striving to improve, but it's good to have consumers who believe in our mission and are eager to share their positive buying, maintenance, and riding experiences.

Tangibles

Cash, merchandise, vehicles, equipment, buildings, and investments are all examples of tangible assets. Accounts receivable, pre-paid expenses, patents, and goodwill are all examples of intangible assets that do not have a physical form. Christopher J. Zane founded Zane's Cycles in Branford, Connecticut, in October 1981, when he was 16 years old. Since then, the store has evolved from a small bicycle and hobby shop to the United States' largest P&I (Premiums and Incentives) bicycle distributor. Zane's Inc today supplies the P&I market with over 65 premium household goods, despite multiple allegations. Zane's journey from small shop owner to major distributor has been featured in a number of major business journals as well as academic marketing classes. His bold local techniques were praised by both Inc. and Fortune magazines, which labelled them guerrilla marketing in some circumstances. His strategy of buying competitors' phone numbers after driving them out of business was one example. His focus on developing a bond with customers through modest gestures, such as selling children's helmets at dealer cost and providing lifetime guarantees, was discussed in a Harvard Business Review article. Reinventing the Wheel: The Science of Creating Lifetime Customers, by Christopher Zane, was published in 2011 by BenBella. The book is a case study of Zane's Cycles' expansion through constant improvement of customer experience and service. Chris speaks on the subject of exceptional customer service and Customer Lifetime Value (CLV) at conferences, universities, and corporate gatherings throughout the world.


 Financial analysis of Tesla


Strategic Management analysis



Student No:0123456


MSc in Accounting and Finance with Data Analytics


Name: XYZ

 


Table of Contents

Overview

Tesla

Ansoff matrix

Tesla Motors and Ansoff model

Matrix comparison

McKinsey matrix

BCG matrix

Comparison of McKinsey and BCG matrix

Product Selection

Tesla Attractiveness

Tesla strategy implementation

Bowman strategy clock

Generic strategy

Cost leadership

Differentiation leadership

Rogers model of Tesla Motors

What is the selected market?

Our selected product

What shall we win

Sources required

Management required

SAF Framework for Tesla

Alignment of goals and plan

Built-in quality of Tesla cars

Transparency

Program execution methodology of Tesla

Product Leadership

Conclusion

Recommendation

References





 Overview:

Tesla:

Tesla is a company incorporated in the USA. The company was founded in 2003, but after Elon Musk joined, its potential really started to show (Perez, 2020). Tesla expected battery electric vehicles to account for about 23% of its stock and plug-in hybrids about 16% globally by 2020. Tesla truly made its name in the market through the introduction of the Model S and Model X. Building on these opportunities, Tesla is investing in the establishment of new factories around the world. This is also the best option for Tesla's growth, as future demand for electric vehicles is expected to rise.

Ansoff matrix:

The Ansoff matrix is used to determine which overarching strategy the company should utilise, as well as which marketing strategies should be implemented to properly aid with the overall strategy of the company (Leslie, 2022). Ansoff matrix is useful as it tells us about which strategic direction that we need to follow in order to successfully grow our business. Graphical representation of Ansoff matrix is given as:

There are four types of business strategy that need to be considered, keeping in mind whether the criteria for each decision are met.
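The four Ansoff quadrants can be sketched as a simple lookup. This is purely illustrative; the helper name and the boolean encoding of "new vs. existing" are assumptions, not from the source.

```python
def ansoff_strategy(new_market: bool, new_product: bool) -> str:
    """Map market and product newness to the corresponding Ansoff quadrant."""
    quadrants = {
        (False, False): "market penetration",   # existing market, existing product
        (False, True):  "product development",  # existing market, new product
        (True,  False): "market development",   # new market, existing product
        (True,  True):  "diversification",      # new market, new product
    }
    return quadrants[(new_market, new_product)]

# Tesla sells a relatively new kind of product (EVs) in a relatively new market:
print(ansoff_strategy(new_market=True, new_product=True))  # diversification
```

The lookup makes the argument of the next section concrete: new market plus new product lands Tesla in the diversification quadrant.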

Tesla Motors and Ansoff model:

The Ansoff matrix provides us with four quadrants based upon different market conditions. In light of those conditions, we can place our company in the diversification quadrant, because it is operating in the electric vehicle market, which is relatively new, and the product it is trying to sell is also relatively new (Dawes, 2018). This calls for diversification in both the market and the product.

Diversification in the market is achieved by introducing many related but new products. By doing so, the customer base increases and goodwill starts to build up. In the end, a point comes when the market has achieved independence from other markets and is producing and selling products on its own (Andrew, 2021).

Tesla can diversify the market by producing many related electric products. Tesla did this with the introduction of its Cybertruck. Along with its grand release hosted by the one and only Elon Musk, the Cybertruck made everyone in the world turn their heads towards the electric vehicle market (Yaqoob, 2021).

Matrix comparison:

The comparison between the McKinsey matrix (also known as the GE matrix) and the BCG matrix is as follows.


The graphical representation of the McKinsey matrix is given as:


Based on the given criteria, our company occupies a cell with a high level of industry attractiveness, because the electric vehicle market is profitable. Also, given the company's past profits and the amount of capital it can raise, it lies in the upper-middle area of the matrix. This means that investment in the company should be a profitable decision (V.J.Thomasa, 2019).


BCG matrix:

The BCG matrix for Tesla is as follows:

Stars (high growth rate, high market share): constant innovation, cleaner energy production, Model 3.

Question marks (high growth rate, low market share): energy storage, solar energy, miscellaneous accessories.

Cash cows (low growth rate, high market share): Model S, Model X, Model Y; current best sellers along with Powerwall chargers.

Dogs (low growth rate, low market share): some manufacturers complain about issues with some models.

The BCG matrix places the condition of our company into four different quadrants. The explanation of these quadrants is given as follows:

The star of our company is continuous innovation, along with the motto of cleaner production of cars. This shall help the company create market goodwill, and as the market matures, this idea shall become a cash cow holding a huge market share (FEEDOUGH, 2022).

Tesla's cash cows are its newer models, the Model X and Model S. These products have created a positive image of the company in the new market. When the market ultimately matures, the company shall experience a cash cow phenomenon: due to economies of scale it will produce cars at lower cost while the price remains the same, hence profiting from their sale.

The dogs' category shall be given to the solar electric products. The company should not invest in this field, as both the market share and market growth are low. These products shall only create cost for the company and will be disadvantageous.

The question mark is Tesla's wall charger. The market growth is high, as the product will be helpful in a variety of ways, but the market share is low. So it creates a critical situation for the company: on one hand the product can prove to be beneficial, and on the other, due to public unacceptance, it may flop.
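The quadrant assignments described above follow directly from two booleans. A minimal sketch (the function name and boolean inputs are illustrative assumptions):

```python
def bcg_quadrant(market_growth_high: bool, market_share_high: bool) -> str:
    """Classify a product into its BCG quadrant from growth rate and market share."""
    if market_growth_high and market_share_high:
        return "star"
    if market_growth_high:
        return "question mark"
    if market_share_high:
        return "cash cow"
    return "dog"

print(bcg_quadrant(True, True))    # star: e.g. Model 3 and constant innovation
print(bcg_quadrant(False, True))   # cash cow: e.g. Model S and Model X
print(bcg_quadrant(True, False))   # question mark: e.g. the wall charger
print(bcg_quadrant(False, False))  # dog
```

This makes explicit why the wall charger is a question mark (high growth, low share) while the Model S is a cash cow (low growth, high share).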

Comparison of McKinsey and BCG matrix:

The McKinsey matrix tells us about the decision that can be made on a specific product at a time, like our Model X and Model S. These decisions are based upon several criteria, such as geographical location and market potential. The BCG matrix, however, takes all products into account at once and evaluates which of them will be beneficial now or in the near future, and which will create unwanted cost for the company.

The McKinsey matrix has 9 grids of decision making, but these can only be applied to a single product at a time, while the BCG matrix has four categories of products and accounts for all products at the same time.

The McKinsey matrix provides the user with a financial decision on a product based upon its factors, while the BCG matrix tells us about the nature of our products and their potential, but gives no decision-making guidance along with it.

The McKinsey matrix compares the maturity of the market with the maturity of our product, whereas the BCG matrix tells us about the market growth rate along with market share. The BCG comparison is better, as market maturity is irrelevant to our company if the market share of our product is low.

The model most suitable for Tesla is the BCG matrix, as it uses the variables of market growth and market share as a whole, which is more accurate than market and product maturity. It also takes all types of products into account at the same time, so a proper comparison of those products can be conducted.

In the case of Tesla, launching a new product such as the Model 3 in this new market will result in the product being a star according to the BCG matrix. Marketing these products shall require diversification of suppliers and buyers according to the Ansoff matrix.

Product Selection:

Tesla has a number of products available for strategy evaluation, but we shall compare the Model S with the Volkswagen ID.4, a flagship electric car series launched by Volkswagen. The comparison is as follows:

The ID.4 has a much cheaper starting price at $40,760 compared to Tesla's Model S, which starts at $94,990. This could allow Volkswagen to increase its market share through a high volume of sales.

The Tesla Model S is a fully electric car, with no variant that consumes any type of fuel, whereas Volkswagen's ID.4 is semi-electric and can also run on fuel. This increases Volkswagen's competitive strength while reducing its industry attractiveness, as the industry is moving towards a green energy production plan that is against the use of fuel vehicles.

Tesla sold 20,301 units of its Model S in 2020, while Volkswagen sold about 2,755 of its units. This helps Volkswagen increase its customer base and diversify its product in the market.

Tesla owns a huge overall market share in the electric industry. According to government financial data, nearly 80% of all electric vehicles registered in the US are Teslas. This leaves a window of at most 20% market share for all of its competitors. In short, Tesla has a near-monopoly on the electric vehicle market.

Based on the analysis above, Tesla and Volkswagen seem to be roughly equal. But because the Tesla vehicle is fully electric, market attractiveness is higher for the Model S, and the profit margin on a Tesla vehicle is higher than on a Volkswagen, resulting in Tesla ending up with more profit than Volkswagen.

Tesla Attractiveness

To decide whether it would be profitable for the company to invest in its new model, the industry attractiveness and competitive strength of two business units are scored below.




Industry Attractiveness

Factor                    Weight   BU1 Rating   BU1 W. Score   BU2 Rating   BU2 W. Score
Industry Growth Rate        35        0.5          17.50           0.7          24.50
Industry Size               13        1.0          13.00           0.3           3.90
Industry Profitability      27        0.9          24.30           1.0          27.00
Industry Structure           6        0.2           1.20           0.8           4.80
Trend of Prices              5        0.4           2.00           0.2           1.00
Market Segmentation         14        0.6           8.40           0.9          12.60
Total Score                100                     66.40                        73.80

Competitive Strength

Factor                    Weight   BU1 Rating   BU1 W. Score   BU2 Rating   BU2 W. Score
Market Share                13        0.4           5.20           0.7           9.10
Relative Growth Rate        35        0.9          31.50           1.00         35.00
Company's Profitability     41        1.00         41.00           0.3          12.30
Brand Value                  7        0.5           3.50           0.1           0.70
VRIO Resources               2        1.00          2.00           0.4           2.00
CPM Score                    2        0.4           0.80           0.9           0.80
Total Score                100                     84.00                        59.90
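The weighted scores in the tables above follow a simple rule: each factor's weight is multiplied by the business unit's rating, and the products are summed. A minimal sketch reproducing the industry attractiveness totals (the data is transcribed from the table; the helper name is an assumption):

```python
# factor: (weight, BU1 rating, BU2 rating), weights summing to 100
industry_factors = {
    "Industry Growth Rate":   (35, 0.5, 0.7),
    "Industry Size":          (13, 1.0, 0.3),
    "Industry Profitability": (27, 0.9, 1.0),
    "Industry Structure":     (6,  0.2, 0.8),
    "Trend of Prices":        (5,  0.4, 0.2),
    "Market Segmentation":    (14, 0.6, 0.9),
}

def weighted_total(factors, unit):
    """Sum weight * rating for a business unit (unit=1 or unit=2)."""
    return round(sum(w * ratings[unit - 1] for w, *ratings in factors.values()), 2)

print(weighted_total(industry_factors, 1))  # 66.4
print(weighted_total(industry_factors, 2))  # 73.8
```

The same rule applies to the competitive strength table, yielding the 84.00 and 59.90 totals shown above.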





According to Tesla's financial data, the sector best suited for investment is the electric car sector. This is because of the relative demand for Tesla cars compared to Tesla batteries, which can be seen through analysis of Tesla's financial statements.





Tesla strategy implementation:

Tesla's market attractiveness is at an all-time high. This is because of effective marketing and the positive social image of Elon Musk in the global market (Zhou, 2022). But since the company was incorporated relatively recently, its competitive strength is not yet at a stable level, because of the low amount of retained earnings compared to its competitors. This places the company in the upper-middle block of the McKinsey matrix, which means it is a good decision to invest in the industry and support growth, as that growth is likely to be profitable.

As for Volvo, its competitive strength is high but its industry attractiveness has dipped in recent years because of the introduction of electric vehicles in the market. This places the company in the middle-left block of the matrix, meaning it is best for it to invest and grow, as it has enough retained earnings to stay competitive in the market.

Bowman strategy clock:

The Bowman Strategy Clock, often referred to as Bowman's Business Plan, is a positioning model that allows a firm to analyse its position relative to rivals' offers. It was developed by two analysts, Cliff Bowman and David Faulkner. Bowman's Strategy Clock illustrates how a company may position a product or related service along two dimensions (Khan, 2021): price on one hand, and perceived value on the other. Examining various combinations of these two parameters within the Bowman Strategy Clock yields eight potential strategies grouped into four quadrants.

According to the Bowman strategy clock, Tesla enjoys a price position of 6, which can be described as risky high margins. The high margin exists because the industry has little to no competition in the market for lithium-ion batteries and electric cars, and Tesla's products are positioned as luxury items (Liang, 2022). Because its products have no real competitor in the market, Tesla can charge almost any amount for them; if needed, it could even move to position 7, known as monopoly pricing. This is because Tesla has the highest market exposure of all the companies that have entered the market and understands the marketing strategy for these specific products.

In the case of Volkswagen, the company has entered the electric car market and hence occupies the range of 4 to 5 on the Bowman clock (Barabba, 2019). This position is lower because of Tesla's presence in the market, and higher than Volvo's because the electric car market is still relatively empty.

Generic strategy:

Porter presented the generic strategies as having four possible outcomes for a company. These are as follows:

Cost leadership.

Cost focus.

Differentiation leadership.

Differentiation focus.

Cost leadership:

Tesla lies in the cost leadership category because the company has broad market competence along with distinction in cost leadership. This distinction comes from special research and development done by Tesla in battery and car production: the company has found ways of producing the batteries for its vehicles cost-effectively, and the majority of the cost of an electric vehicle comes from producing the lithium-ion battery (ThomasGreckhamera, 2021).

Differentiation leadership:

Tesla, in order to dominate the market, wanted a one-of-a-kind individual to run the corporation. For this reason, Tesla appointed Zachary Kirkhorn as its CFO. Kirkhorn worked as a Senior Business Analyst at the well-known firm McKinsey & Company for nearly three years. He reportedly joined Tesla as a Senior Analyst in the Finance department in 2010, was named Director of Finance in December 2014, and ultimately became Vice President of Finance in December 2018. Volkswagen has made its way into the electric vehicle market but has not quite gotten a grip on market trends. For now, it can focus on a cost focus strategy: using its finances to research better ways to reduce the cost of its selected product. Because of its long history of profitable years, Volkswagen has a large amount of retained earnings available to finance this research.

Rogers model of Tesla Motors:

Rogers' innovation adoption curve is a model that separates the adopters of an innovation into groups, based on the idea that some people are more open to change than others. It is also known as the Multi-Step Flow Theory or Diffusion of Innovations Theory, and it ranges from the innovators, who adopt earliest, through the early adopters and the early and late majorities, to the laggards (poorghorban, 2021).

To apply this model, Tesla needs to answer the following questions:

What is the selected market?

Tesla has selected the market of electric vehicles, as the market is in its early development phase with few competitors. This decision was made taking into account the low amount of capital needed to enter and remain in the market.

Our selected product:

Tesla selected a product that is able to run entirely on electricity, negating the use of fuel that all the other cars on the road desperately need. This was done not only to bring innovation but also to attract a general public that is now inclined towards a green future.

What shall we win:

If Tesla is able to build goodwill in the market, it can work as a monopolistic company by the time the market matures. This is because it will have gained buyers' trust in the market, and the only thing remaining shall be to maintain that goodwill.

Sources required:

Since Tesla entered a market where no previous company had conducted research and development, the chances of drawbacks and initial losses were high enough to be virtually certain. To cope with this, Tesla needed a hefty amount of capital to build its infrastructure from the start.

Management required:

Tesla, in order to survive in the market, needs good management; otherwise its story would end before it even started. To cope with this issue, Tesla hired Zachary Kirkhorn, who holds an MBA from Harvard. He and the business giant Elon Musk are together responsible for leading Tesla into a brighter future.

One competitor of Tesla is Volkswagen, an early-majority adopter when it comes to the electric vehicle market. The company has set foot in a market where competition is comparatively low compared to the conventional vehicle market, which is now crowded with many manufacturers. It can enjoy a better and more profitable future, as by the time the market matures the company will have built up positive goodwill.

SAF Framework for Tesla:

You may use the SAF Matrix to rank Tesla's strategic choices based on Suitability, Acceptability, and Feasibility (SAF). This process is also called SAF Matrix analysis.


SAF Analysis recommends that the approach with the highest score is the best alternative.


Suitability

A strategy's suitability is evaluated in terms of its capacity to help the company achieve its objectives. This is a critical question because, ultimately, a strategy exists to achieve specific goals and objectives.

For suitability, ask: does the suggested strategy address the organization's most important opportunities and threats?

Several factors matter when assessing suitability. Consider not just whether the strategy will deliver the goals and objectives, but also whether it is appropriate for Tesla's culture, market, and capabilities.


Acceptability

Stakeholders' perspectives are crucial to the acceptability part of the SAF analysis. Managing stakeholders is an essential part of strategy, since everyone needs to be aligned behind the same plan.

Acceptability, like suitability, is linked to Tesla's strategic aims. You need to know what returns to expect from the strategy, as well as how much risk there is that those gains will not materialise.

Although a company's return or risk is not purely financial, acceptability is generally judged against data. Frameworks such as PESTLE Analysis and the Five Forces model, which focuses on profitability, can be an important first step in developing an acceptable strategy.


Feasibility

When it comes to strategic planning, feasibility is critical because it asks whether the firm has the capacity to carry out the strategy. Resources, talent, timing, changes in the market, and financial means may all be taken into account in this part of the SAF Analysis.


Program execution methodology of tesla:

Program execution is at the heart of Tesla's plan and drives everything else in the framework. Teams and projects must be able to deliver high-quality, working software and commercial value on a regular basis. Tesla's quality and volume in a particular type of vehicle create an opening for IT firms with ample experience in human-machine interfaces. Some tech CEOs nevertheless believe the IT giants will focus on machine-learning software for vehicles already on the general market, and that there will be no more barriers to entry. Those barriers have indeed fallen, but producing high volumes of cars at low cost is very different from the limited production of luxury vehicles with which Tesla began.

Product Leadership:

Product leadership is one of the most important elements: only leaders can modify the system and create the atmosphere required for the basic concepts to be adopted, which is why SAFe demands a lean-agile leadership style. One major concern is the pressure on the CEOs of established car companies. The median tenure of a company CEO is about five years, while a new strategic initiative requires seven to eight years; without strong quarterly results, a CEO cannot afford the luxury of pursuing such an initiative. The successful new entrants will be those whose leadership holds large shareholdings or which are privately controlled, such as Tesla, because they have the leverage to pursue the technology, business model, and market trends while focusing on their long-term strategy.

The Scaled Agile Framework's concepts are intended to benefit the entire firm by encouraging lean-agile decision-making across functional and organisational boundaries. The principles are meant to influence not only the actions of leaders and managers, but the attitude of everyone in the organization toward lean-agile thinking, which includes methodologies like Lean Portfolio Management.

Tesla's competitors, such as GM and Volkswagen, have made concrete plans to enter and capture what is still a new and immature market. Both companies have the means to build quality offerings, thanks to efficient research and development departments and vast reserves of retained earnings.

Conclusion:

Tesla has made a proper plan to establish its business in the market for electric vehicles. It has formed a coherent strategy and appointed qualified leadership to turn that plan into reality, and the program has been executed appropriately. In short, when it comes to the electric-vehicle market, Tesla has the upper hand.

Recommendation:

Tesla should invest in its research and development department to further reduce costs while increasing the efficiency of its batteries, because the industry will become more crowded in the coming years. Competition from rival products, and demand for competitive products, are bound to increase. To cope with this, Tesla is establishing factories in new countries while current government subsidies can reduce the cost of expansion. This step is critical: the probability of further subsidies is low, which will make market entry much harder for later arrivals.


 



Risk Management Research Article


Risk Management

 

Table of Contents

Executive Summary
Introduction
Literature Review
Empirical Results
Conclusion
References
Appendix


Risk Management Research Analysis


 

Executive Summary

The danger of borrowers defaulting on their loan obligations is referred to as credit risk. In recent years a large number of institutions have created sophisticated systems and models for quantifying, aggregating, and managing risk, and the outputs of these models are becoming ever more central to banks' risk management and performance monitoring processes. In this research we address the problem of short-term loan default prediction for a Tunisian commercial bank, using a database of 924 credit records granted to Tunisian businesses by that bank from 2003 to 2006. The findings of the K-Nearest Neighbour classifier technique show that the best information set combines accrual and cash-flow data, with a correct classification rate of the order of 88.63 percent (for k = 3). A ROC curve is plotted to analyse the model's performance: the AUC (Area Under Curve) criterion is 87.4 percent for the first model, 95 percent for the third model, and 95.6 percent for the best model, which incorporates cash-flow information.

 

Introduction

The assessment of bank credit risk is frequently utilized by banks all around the world. Because credit risk assessment is so important, a variety of methods are employed to determine risk levels. Furthermore, one of the financial community's primary tasks is credit risk management (Serval, 2008). Credit risk is defined by the Basel Committee on Banking Supervision as the risk that a bank borrower or counterparty may fail to meet agreed-upon obligations. Clients are classified by their profile by banks. Customers' financial backgrounds and subjective criteria are assessed during the classification process. Financial ratios are crucial in determining risk levels (Berk et al., 2011). These are objective ratios that show the financial statement of a company. Financial documents such as the balance sheet, income statement, and cash flow statement are used to gather data for calculating objective financial ratios. There are numerous more subjective criteria, which are dependent on the bank's decision-making approach and mission (Berk et al., 2011).

In a consulting document, the Basel Committee on Banking Supervision attempted to provide guidance to banks and supervisors on appropriate credit risk assessment and valuation policies and practices for loans, regardless of the accounting methodology used. A bank's policies should correctly address validation of any internal credit risk assessment models, according to the third principle in this article. This principle's implementation turns out to be a daily decision based on a binary classification issue that distinguishes excellent payers from bad payers (Karaa & Krichene, 2012). Clearly, assessing insolvency plays a significant function, as a good evaluation of a borrower's quality can aid in deciding whether or not to issue the sought loans. The Basel Committee suggests using either an external mapping technique or an internal rating system to calculate credit risk capital needs (Karaa & Krichene, 2012).

Although the external mapping strategy is difficult to apply due to the lack of external rating grades, the internal rating approach is simple and straightforward to adopt, since many techniques for developing credit-risk assessment models have been proposed in the literature. Furthermore, the subprime mortgage crisis, which rocked the United States and Europe and revealed the banking sector's vulnerability, has cast doubt on the accuracy and utility of agency ratings (Matoussi & Abdelmoula, 2009). Credit scoring methodologies, in reality, are used to assess both objective and subjective variables. These methods became popular all across the world in the 1950s (Abramowicz et al., 2003); they standardize the collection of client information, and the scoring system is used to determine whether or not a loan should be approved. The techniques applied include traditional statistical methods such as logistic regression (Steenackers & Goovaerts, 1989), multivariate discriminant analysis (MDA) (Altman, 1968), classification trees (Davis et al., 1992), and neural network (NN) models (Desai et al., 1996; Matoussi & Abdelmoula, 2009; Karaa & Krichène, 2012), as well as nonparametric statistical models such as the k-nearest neighbour (Hand & Henley, 1997). Bayesian classification rules using Naïve Bayes classifiers have been proposed in recent contributions, and the findings of these investigations show that they can often outperform the most commonly used strategies. In this context, the bankruptcy prediction models developed by Sarkar and Sriram (2001) and Sun and Shenoy (2007) were successful.



Literature Review

Theoretical Framework of Credit Risk Problem

The formulation of the optimal form of the lending contract is one of the most significant applications of agency theory to the lender-borrower dilemma. There is an information imbalance in the credit market between the borrower, who usually has better information about the investment project and its possible earnings and risks, and the lender (the bank), which does not have sufficient and reliable information about the investment project. This lack of information, in both quantity and quality, causes issues both before and after the transaction: moral hazard and adverse selection are common when asymmetric information is present. A classic principal-agent dilemma can be seen in this scenario. According to the nature of the information asymmetry, the principal-agent models of agency theory can be classified into three groups (Karel, 2006). First, there are models classified as moral hazard, which feature ex-post asymmetric information: after signing the contract, the agent obtains some private information. Moral hazard happens when an asymmetric information problem arises after a transaction has taken place. Because the borrower possesses information about the project that the lender does not, the lender runs the risk that the borrower engages in activities the lender does not want, because they make it less likely that the loan will be repaid (Matoussi & Abdelmoula, 2009).

Second, there are adverse selection models, which feature ex-ante asymmetric information (Karel, 2006): before signing the contract, the agent has access to private information. Adverse selection occurs when a borrower has significant information about the quality of a project that the lender does not (or vice versa) before the transaction takes place. This happens when the potential borrowers who are most likely to produce a bad outcome (poor credit risks) are the most eager in seeking a loan and hence the most likely to be chosen. Because the riskiness of projects is unknown, lenders' pricing cannot distinguish between good and bad borrowers in the basic situation. Finally, the third class is signaling.

This issue has usually been studied within the context of costly state verification, first introduced by Townsend (1979). The agent, who has no endowment, borrows money from the principal to fund a one-time investment project, and faces a moral hazard problem: should he report the true value of the project's outcome, or under-report it? This is ex-post moral hazard. We may also encounter an ex-ante moral hazard situation, in which the agent's unobservable effort during the project's implementation affects its outcome. According to Townsend (1979), the best contract for solving this problem is the ordinary (or simple) debt contract, whose face value is the amount that the agent must repay once the project is completed. Diamond (1984) proposed another theoretical basis for simple debt contracts, in which a costly punishment replaces the costly state verification. According to Hellwig (2000, 2001), the two models are comparable only under the risk-neutrality assumption: when risk aversion is introduced, the costly state verification model continues to work, but the costly punishment model fails. In the real world, credit institutions can deal with the asymmetric information problem and its repercussions on credit risk appraisal by using guarantees (collateral), bankruptcy prediction modelling, or both (Karaa & Krichène, 2012).



Credit Risk Assessment and Bankruptcy Prediction

Following a rash of high-profile bank failures in Asia, regulators have recognized the need for advanced technology to assess credit risk in bank portfolios and have urged banks to use it. Correctly assessing credit risk also allows banks to engineer future lending transactions to meet specific return/risk criteria. Credit risk analysis necessitates the development of reasonably accurate quantitative prediction models that can serve as early warning signals for counterparty defaults. In the literature, a number of researchers presented two primary approaches to credit scoring. The first approach, known as structural or market-based models, was proposed by (Merton, 1974), and is based on modelling the underlying dynamics of interest rates and business characteristics to derive the default probability derivation. Initially, this method is based on the asset value model, which has an endogenous default process and is linked to the firm's capital structure. When the value of a company's assets falls below a certain threshold, it is said to have defaulted (Crouhy et al., 2000).

The second method relies on empirical or accounting-based models, in which the relationship between default likelihood and business characteristics is learned from data rather than modelled. Some strategies in this area were synthesized by Raymond (2007), Thomas et al. (2002), and Galindo and Tamayo (2000). Academics and practitioners have researched bankruptcy prediction extensively, as evidenced by the studies of Beaver (1966) and Altman (1968), and several models have been created and empirically tested. Altman's well-known Z-Score (Altman, 1968) is a linear discriminant analysis model used to forecast the likelihood of a corporation defaulting. The Ohlson O-Score (Ohlson, 1980) is based on generalized linear models, or multiple logistic regression, which have been used to determine the best predictors of bankruptcy and the prediction accuracy rate of their occurrence. Neural network models have also been adapted and employed to anticipate bankruptcy (Atiya, 2001; Matoussi & Abdelmoula, 2009); their strong predictive capability, with the capacity to incorporate a large number of features in an adaptive nonlinear model, makes them a popular choice (Kay & Titterington, 2000).

Many studies have concentrated on non-parametric approaches such as the k-nearest neighbour (Henley & Hand, 1996), decision trees (Quinlan, 1992), and neural networks (McCulloch & Pitts, 1943). Other classification systems, such as the Support Vector Machine, combine various techniques to construct a classification model (e.g., Lee and Chen, 2005; Lee et al., 2002). West (2000) compared the credit scoring accuracy of five Artificial Neural Network models: multilayer perceptron, radial basis function, fuzzy adaptive resonance, mixture-of-experts, and learning vector quantization. He employed two real-world data sets, one Australian and the other German, used tenfold cross-validation to improve predictive power, and reported both good and bad credit rates. Finally, he compared the results with five common techniques: linear discriminant analysis, logistic regression, k-nearest neighbour, kernel density estimation, and decision trees. The findings suggest that the multilayer perceptron may not be the most accurate Artificial Neural Network model, and that credit scoring applications should consider the mixture-of-experts and radial basis function Neural Network models. Among the traditional methods, logistic regression proved a more accurate and precise alternative to the Neural Network models in the typical setting.

Despite extensive research into credit scoring, Vera et al. (2012) claim that no consensus exists on the best classification technique to utilize, and Baesens et al. (2003b) discovered that conflicting results can occur when comparing the outcomes of different investigations. However, according to Thomas et al. (2002), most credit scoring techniques perform similarly. Indeed, the interpretability and openness of certain procedures may lead banks and financial organizations to choose them (Martens et al., 2009). According to Vera et al. (2012), the predictive performance of credit scoring algorithms and the insights or interpretations offered by the model are both essential.



Empirical Research Design

Banks operate in a highly competitive market, so the quality of service provided during credit risk assessment is critical. To gain a competitive edge, when a customer requests credit, the bank should examine the request as quickly as feasible (Berk et al., 2011). Furthermore, the same process is repeated for each credit demand, which costs the bank money. Because of the importance of credit risk analysis, financial institutions have created a variety of approaches and models to help them decide whether or not to extend credit (Çinko, 2006).

Classification methods fall into two types: parametric and non-parametric. Parametric techniques solve problems by estimating the parameters of distributions based on the assumption of normally distributed populations (Zhang et al., 2007). Non-parametric approaches, on the other hand, make no assumptions about the individual distributions involved and are thus distribution-free, according to Berry and Linoff (1997). The k-nearest neighbour classifier exemplifies a non-parametric statistical method. Given an unknown case, a K-NN classifier searches the pattern space for the k training instances that are most similar to it (Pranab & Radha, 2013); these k training cases are the k nearest neighbours of the unknown case (Ravinder & Aggarwal, 2011). The K-NN classifier can be effective when the dependent variable takes multiple values, such as high risk, medium risk, and low risk. For optimal performance it needs an equal number of good and bad sample examples (Hand & Henley, 1997). The algorithm's performance also depends on the choice of k, according to Berry and Linoff (1997), and this can be tested: starting with k = 1, we assess the classifier's error rate on a test set; each time, k is increased to accommodate one more neighbour, and the procedure is repeated. The k value that yields the lowest error rate can then be chosen. The larger the number of training samples, the higher the value of k can be.
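This selection procedure can be sketched with a toy k-NN classifier; the two-ratio sample points below are invented for illustration and stand in for the much richer financial-ratio feature set used in the study:

```python
import math
from collections import Counter

def knn_predict(train, query, k):
    """Majority label among the k nearest training points (Euclidean distance)."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def error_rate(train, test, k):
    """Fraction of test cases the k-NN classifier gets wrong."""
    return sum(knn_predict(train, x, k) != y for x, y in test) / len(test)

# Hypothetical (liquidity, leverage) ratio pairs labelled healthy/risky.
train = [((0.90, 0.20), "healthy"), ((0.80, 0.30), "healthy"),
         ((0.70, 0.25), "healthy"), ((0.20, 0.90), "risky"),
         ((0.30, 0.80), "risky"),   ((0.25, 0.70), "risky")]
test = [((0.85, 0.22), "healthy"), ((0.28, 0.85), "risky")]

# Try successive k values and keep the one with the lowest test error,
# as described in the paragraph above.
best_k = min((1, 3, 5), key=lambda k: error_rate(train, test, k))
print(best_k, error_rate(train, test, best_k))
```

With a balanced sample, as recommended above, the raw error rate is a reasonable selection criterion; with unbalanced classes, the type I/II split reported later matters more.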

ROC Curve as Performance Classifier

Receiver Operating Characteristic (ROC) analysis is a widely used performance-graphing method. A ROC graph is a visual technique for organizing and selecting classifiers based on their performance (Fawcett, 2006). ROC graphs were first used in machine learning by Spackman (1989), who showed how ROC curves can be used to evaluate and compare algorithms (Fawcett, 2006). ROC graphs have become increasingly popular in the machine learning community in recent years, because simple classification accuracy is often an inadequate criterion for assessing performance (Provost & Fawcett, 1997; Provost et al., 1998). They also have properties that make them particularly useful in domains with skewed class distributions and unequal classification error costs (Fawcett, 2006).




The performance of a classifier is represented by a ROC curve, which is a two-dimensional graph. To compare classifiers, Fawcett (2006) recommends reducing ROC performance to a single scalar value that represents expected performance. Many scholars, including Bradley (1997) and Hanley and McNeil (1982), propose calculating the area under the ROC curve, also known as the AUC. The AUC is a fraction of the unit square's area, so its value always lies between 0 and 1.0. However, no realistic classifier should have an AUC smaller than 0.5, because random guessing yields the diagonal line between (0, 0) and (1, 1), which has an area of 0.5 (Fawcett, 2006).
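One convenient way to compute the AUC follows from its probabilistic interpretation (Hanley & McNeil, 1982): it equals the probability that a randomly chosen positive case is scored above a randomly chosen negative case, with ties counting half. A small sketch with made-up classifier scores:

```python
def auc(pos_scores, neg_scores):
    """Pairwise estimate of the Area Under the ROC Curve: the fraction of
    (positive, negative) score pairs ranked correctly, ties counting half."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

risky_scores = [0.9, 0.8, 0.7, 0.6]    # hypothetical scores for risky firms
healthy_scores = [0.4, 0.3, 0.6, 0.1]  # hypothetical scores for healthy firms

print(auc(risky_scores, healthy_scores))  # -> 0.96875
```

A value of 1.0 means perfect separation, while 0.5 corresponds to the random-guessing diagonal described above.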

Empirical Results

We created three different K-NN classifiers in our experiment. The first classifier uses financial-ratio data with cash flows excluded; it will be referred to as the non-cash-flow model. The second incorporates information from all ratio indicators, with cash flows included and collateral excluded; it will be called the cash-flow model. The third uses all of the study's indicators and will be termed the full-information model. The k-nearest neighbour (k-NN) methodology, according to Rafiul et al. (2008), is a simple and intuitively appealing strategy for dealing with classification difficulties because of its interpretable nature. Choosing an acceptable distance function for k-NN, on the other hand, can be difficult, and a poor choice can make the classifier extremely susceptible to data noise. We experimented with a range of k values (2, 3, 4, and 5) and, based on our testing, determined the value of k that gave the best classification performance, as presented in the result tables.

Results for the non-cash-flow model (k-NN classifier, k = 2):

                     Classified healthy   Classified risky
Healthy companies           358                 100
Risky companies             100                 366

% Total good and bad classification:
Good classification: 78.35%
Bad classification:  21.64%



Results for the cash-flow models:

                        K=2              K=3              K=4              K=5
                  Healthy  Risky   Healthy  Risky   Healthy  Risky   Healthy  Risky
Healthy companies    395     63       409     49       387     71       375     83
Risky companies       59    407        56    410        72    394        92    374

% Total good and bad classification:
Good classification:  86.79%    88.63%    84.52%    81.06%
Bad classification:   13.20%    11.37%    15.48%    19.94%








Results for the full-information models:

                        K=2              K=3              K=4              K=5
                  Healthy  Risky   Healthy  Risky   Healthy  Risky   Healthy  Risky
Healthy companies    393     65       406     52       381     77       383     75
Risky companies       69    397        69    397        99    367       113    353

% Total good and bad classification:
Good classification:  85.5%     86.90%    80.95%    79.65%
Bad classification:   14.50%    13.10%    19.05%    20.35%



The tables above present the classification results for both the cash-flow and the full-information models. They show that the best model, with the best classification rate, is the one combining accrual and cash-flow data: a correct classification rate of 88.63 percent versus 86.90 percent for the full-information model with all the data. Recall that our goal is to determine a new point's class label, and that the algorithm's behaviour varies with k; the value of k was chosen accordingly in this study. Increasing k to three improved the findings, raising the percentage of correctly classified cases. Furthermore, when cash-flow information is included, the model reduces the type I error from 16.73 percent to 12.01 percent and the type II error from 20.52 percent to 10.69 percent.
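These error rates follow directly from the confusion matrices. A short sketch, using the k = 3 cash-flow counts reported in the tables above, reproduces the quoted type I and type II figures up to rounding:

```python
def error_rates(hh, hr, rh, rr):
    """Type I and II errors from confusion-matrix counts:
    hh = healthy classified healthy, hr = healthy classified risky,
    rh = risky classified healthy,  rr = risky classified risky."""
    type_1 = rh / (rh + rr)   # share of risky firms accepted as healthy
    type_2 = hr / (hh + hr)   # share of healthy firms rejected as risky
    return type_1, type_2

# k = 3 cash-flow model counts from the table above.
t1, t2 = error_rates(hh=409, hr=49, rh=56, rr=410)
print(f"Type I: {t1:.2%}, Type II: {t2:.2%}")
```

Type I errors (lending to a firm that defaults) are usually the costlier of the two for a bank, which is why both rates are tracked separately rather than folded into a single accuracy figure.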

Criterion of the type I and type II error:

                           K=2       K=3       K=4       K=5
Non-cash-flow model
  Type I                 21.83%    16.73%    27.51%    27.25%
  Type II                21.45%    20.52%    26.18%    27.72%
Cash-flow model
  Type I                 12.66%    12.01%    15.45%    19.74%
  Type II                13.75%    10.69%    15.50%    18.12%
Full-information model
  Type I                 14.8%     14.80%    21.24%    24.24%



We estimate credit risk in this study by employing a set of financial ratios that are commonly used in loan contracts. The choice of financial ratios reflects the link between financial ratios and credit risk, evidence that is well known in the practitioner and academic literature (Demerjian, 2007). Indeed, textbooks emphasize the importance of ratios in evaluating credit quality (Lundholm & Sloan, 2004), and academic research finds that, when employed as covenants, financial ratios provide signals about borrower credit risk (Smith & Warner, 1979; Dichev & Skinner, 2002).

The ROC Curve

A perfect classifier's ROC curve follows the two axes, ordering all bad cases before good cases: for a suitable value of the threshold, it would classify 100% of bad cases as bad and 0% of good cases as bad. A classifier whose ROC curve follows the 45° line is ineffective, according to Yang (2002): at each value of the threshold, the same proportion of bad and good cases would be assigned to the bad class, and the classes would not be separated. Real-world classifiers produce ROC curves between these two extremes. The Area Under the ROC Curve (abbreviated AUC) is a metric that can be used to assess a curve's performance (Hand, 1997); the curve with the higher AUC is preferable to the one with the lower AUC. The AUC for the best model (the cash-flow model) is around 95.6 percent, a good score because it is well above 50 percent. This result backs up the preceding section's finding of a high classification rate. We may conclude that cash-flow data is a useful signal for bankers assessing loan applicants.




Conclusion

Commercial banks that lend to clients require consistent models that can accurately detect and anticipate defaults. In the current competitive and unpredictable economic environment, Moonasar (2007) stressed that reducing credit risk is one of the most important issues every bank must deal with. To assess a credit applicant's creditworthiness, scoring systems are traditionally used. Credit scoring is a mathematical tool for assessing creditworthiness (Yang, 2002): it is a classification process that divides borrowers into risk groups, and scoring systems are used to estimate the likelihood of a borrower or counterparty defaulting (Komor'ad, 2002). Credit scoring is therefore crucial in assessing credit risk. According to Moonasar (2007), a popular approach to credit scoring is to use a classification algorithm on data from prior customers to identify a link between the customers' characteristics and their likelihood of defaulting on their debt. Lenders use such systems to evaluate whether applicants are likely to repay, and an accurate classifier is required to distinguish between potentially good and bad credit applicants. According to the main results, the best K-NN uses k = 3 for all three models, and the best global classification rate is of the order of 88.63 percent. Additionally, a ROC curve was plotted to evaluate the model's performance; the AUC criterion reaches 95.6 percent. Our research nevertheless has its limitations.

 

References

Abid, F. & A. Zouari (2000) “Financial distress prediction using neural networks”, http://ssrn.com/abstract=355980 or DOI: 10.2139/ssrn.355980.

Abramowicz, W. M. Nowak, J. Sztykiel (2003) “Bayesian networks as a decision support tool in credit scoring domain”, Idea Group Publishing

Altman, E. I. (1968) “Financial ratios, discriminant analysis and the prediction of corporate bankruptcy”, Journal of Finance, vol. 23: 589–609

Anderson, D.R., Sweeney, D.J., Freeman, J., Williams T.A. & Shoesmith, E. (2007) “Statistics for business and economics”, London: Thomson Learning EMEA

Antonakis, A. C. & Sfakianakis, M. E. (2009) “Assessing naive bayes as a method for screening credit applicants”, Journal of Applied Statistics, vol. 36: 537-545

Atiya, A.F. (2001) “Bankruptcy prediction for credit risk using neural nets: a survey and new results”, IEEE Transactions on Neural Nets, vol. 12 (4): 929-935

Beaver, W. (1966) “Financial ratios as predictors of failure. Empirical research in accounting: Selected studies”, Journal of Accounting Research, vol. 5: 71–111

Bekiroglu, B., Takci, H. & Ekinci, U. C. (2011) “Bank credit risk analysis with Bayesian network decision tool”, International Journal of Advanced Engineering Sciences and Technologies, vol. 9, no. 2: 273-279

Berry. M.J.A. & Linoff, G.S (1997) Data mining techniques for marketing, sales, and customer support, John Wiley & Sons, Inc.

Berstein, L. A. & Wild J.J. (1998) Financial statement analysis: theory, application, and interpretation, sixth Edition, McGraw-Hill

Bogess, W. P. (1967) “Screen -test your credit risks”, Harvard Business Review, vol. 45, no. 6: 113-122

Bradley, A.P. (1997) “The use of the area under the ROC curve in the evaluation of machine learning algorithms”, Pattern Recogn, vol. 30(7): 1145-1159

Çinko, M. (2006) “Comparison of credit scoring techniques”, İstanbul Ticaret Üniversitesi Sosyal Bilimler Dergisi, vol. 9: 143-153

Crouhy, M.; Galai, D.; Mark, R. (2000) “A comparative analysis of current credit risk models”, Journal of Banking and Finance, vol. 24, no. 1: 59-117

Davis R. H., Edelman, D.B. & Gammerman, A.J. (1992) “Machine learning algorithms for credit-card applications”, IMA Journal of Management Mathematics, vol. 4: 43-51

Davutyan, N. & Özar, S. (2006) “A credit scoring model for Turkey’s micro & small enterprises (MSE’s)”, 13th Annual ERF Conference, Kuwait, 16-18 December 2006

Demerjian, P. R. W (2007) “Financial ratios and credit risk: the selection of financial ratio covenants in debt contracts”, working paper, workshop Stephen M. Ross School of Business University of Michigan, January 11

Desai, V. S., Crook, J. N. & Overstreet, G. A. (1996) “A comparison of neural networks and linear scoring models in the credit union environment”, European Journal of Operational Research, vol. 95(1): 24–37

Galindo, J. & Tamayo, P. (2000) “Credit risk assessment using statistical and MachineLearning: basic methodology and risk modeling applications”, Computational Economics, vol. 15(1-2): 107- 143

Hand, D. J. (1997) Construction and assessment of classification rules, Wiley series in probability and statistics, John Wiley & Sons

Hand, J. & Henley, W. (1997) “Statistical classification methods in consumer credit scoring”, Computer Journal of the Royal Statistical Society Series a Statistics in Society”, vol. 160, no. 3: 523-541

Hanley, J.A. & McNeil, B.J. (1982) “The meaning and use of the area under a receiver operating characteristic (ROC) curve”, Radiology, vol. 143: 29–36

Hellwig, M. (2000) “Financial intermediation with risk aversion”, Review of Economic Studies, vol. 67(4): 719–742

Hellwig M. (2001) “Risk aversion and incentive compatibility with ex post information Asymmetry”, Economic Theory, vol. 18 (2):415–438.

Henley, W.E. & Hand, D.J. (1997) “Statistical classification methods in consumer credit scoring: a review”, Journal of the Royal Statistical Society. Series A (Statistics in society), vol. 160, no. 3: 523- 541

Karaa, A. & Krichène, A. (2012) “Credit-risk assessment using support vectors machine and multilayer neural network models: a comparative study case of a Tunisian bank”, Accounting and Management Information Systems, vol. 11, no. 4: 587-620

Karel, J. (2006) “Agency theory approach to the contracting between lender and borrower”, Acta Oeconomica Pragensia, 14/3

Kay, J. & Titterington, M. (eds) (2000) “Statistics and Neural Nets, Advances at the Interface”, Oxford University Press

Komorad, K. (2002) “On credit scoring estimation”, Institute for statistics and econometrics, Humboldt University, Berlin

Lee, T. & Chen, I. (2005) “A two-stage hybrid credit scoring model using artificial neural networks and multivariate adaptive regression splines”, Expert Systems with Applications, vol. 28(4): 743–752

Lee, T., Chiu, C., Lu, C. & Chen, I. (2002) “Credit scoring using the hybrid neural discriminant technique”, Expert Systems with Applications, vol. 23(3): 245-254

Lundholm, R. & Sloan, R. (2004) Equity valuation & analysis, New York; McGraw-Hill/Irwin

Matoussi, H. & Krichène Abdelmoula, A. (2010) “Credit risk evaluation of a Tunisian commercial: Bank: logistic regression versus Neural Network Modelling”, Accounting and Management Information Systems, vol. 9, no. 1

Matoussi, H., Mouelhi, R. & Salah, S. (1999) “La prédiction de faillite des entreprises tunisiennes par la régression logistique”, Revue Tunisienne des Sciences de Gestion, vol. 1: 90-106

Mcculloch, W. & Pitts, W. (1943) “A logical calculus of the ideas immanent in nervous activity”, Bulletin of Mathematical Biophysic, vol. 5: 115-133

Merton, R. (1974) “On the pricing of corporate debt: The risk structure of interest rates,” Journal of Finance, vol. 29: 449-470

Mileris, R. (2010) “Estimation of loan applicants default probability applying discriminant analysis and simple Bayesian classifier”, Economics and Management, vol. 15: 1078-1084

Moonasar, V. (2007) “Credit risk analysis using artificial intelligence: evidence from a leading South African banking institution”, Research Report: Mbl3

Ohlson, J. A. (1980) “Financial ratios and the probabilistic prediction of bankruptcy”, Journal of Accounting Research, vol. 18: 109-131

Palepu K.G., Healy, P.M. & Bernard, V.L. (2000) Business analysis & valuation using financial Statements, second Edition, South – Western College Publishing

Pranab Kumar, D. G. & Radha Krishna, P. (2013) Database management system: Oracle SQL and PL/SQL, PHI Learning Pvt. Ltd., 576 pages

Provost, F. & Fawcett, T. (1997) “Analysis and visualization of classifier performance: Comparison under imprecise class and cost distributions”, In: Proc. Third Internat. Conf. on Knowledge Discovery and Data Mining (KDD-97). AAAI Press, Menlo Park

Provost, F., Fawcett, T. & Kohavi, R. (1998) “The case against accuracy estimation for comparing induction algorithms”, In: Shavlik, J. (Ed.) Proc. ICML-98. Morgan Kaufmann, San Francisco, Available from: <http://www.purl.org/ NET/tfawcett/papers/ICML98-final.ps.gz>.

Quinlan, J. R. (1992) C4.5: programs for machine learning, Morgan Kaufmann Publishers Inc., California

Raymond, A. (2007) The Credit Scoring Toolkit: Theory and Practice for Retail Credit Risk Management and Decision Automation, Oxford University Press, United States of America, 1st edition.

Revsine, L., Collins, D.W. & Johnson, W.B. (1999) Financial Statement and Analysis, Prentice Hall, New Jersey.

Rosner, B.A. (2006) Fundamentals of Biostatistics, Taunton: Quebecor World

Rumelhart, D.E., Hinton, G.E. & McClelland, J.L. (1986) “A general framework for parallel distributed processing”, in Parallel Distributed Processing: explorations in the microstructure of cognition, vol. 1, pp. 45-75

Sarkar, S. & Sriram, R.S. (2001) “Bayesian models for early warning of bank failures”, Management Science, vol. 47(11): 1457-1475

Seval, S. (2008) “Credit risk and Basel II”, Credit Risk Solutions Inforsense

Smith, C. & Warner, J. (1979) “On financial contracting”, Journal of Financial Economics, vol. 7: 117-161

Spackman, K.A. (1989) “Signal detection theory: Valuable tools for evaluating inductive learning”, In: Proc. Sixth Internat. Workshop on Machine Learning, Morgan Kaufman, San Mateo, CA, pp. 160-163.

Steenackers, A. & Goovaerts, M.J. (1989) “A credit scoring model for personal loans”, Insurance Mathematics and Economics, vol. 8: 31-34

Thomas, L.C. (2002) “A survey of credit and behavioral scoring: forecasting financial risk of lending to consumers”, International Journal of Forecasting, vol. 15: 149-172

Townsend, R. M. (1979) “Optimal contracts and competitive markets with costly state verification”, Journal of Economic Theory, vol.21 (2): 265-293

Vera L. M., Dries F.B. & Van den Poel, D. (2012) “Enhanced decision support in credit scoring using Bayesian binary quantile regression”, Working Paper August

West, D. (2000) “Neural network credit scoring models”, Computers & Operations Research, vol. 27 (11): 1131-1152

Yang, L. (2002) “The evaluation of classification models for credit scoring”, Working Paper no. 02/2002 Edit. Matthias Schumann University of Göttingen Institute of computer science

Zhang, D., Huang, H., Chen, Q. & Jiang, Y. (2007) “Comparison of credit scoring models”, Third international conference of Natural Computation, vol. 1

 

Appendix

APPENDIX 1: NON-CASH FLOW MODEL

Panel 1: K-NN with k=2

Panel 2: K-NN with k=3

Panel 3: K-NN with k=4
