IN-MEMORY TECHNOLOGY AND THE AGILITY OF
BUSINESS INTELLIGENCE – A CASE STUDY AT A GERMAN
SPORTSWEAR COMPANY
Tobias Knabke, Mercator School of Management, University of Duisburg-Essen, Duisburg,
Germany, tobias.knabke@uni-due.de
Sebastian Olbrich, Mercator School of Management, University of Duisburg-Essen, Duisburg,
Germany, sebastian.olbrich@uni-due.de
Abstract
The retail industry has changed significantly in recent years due to altered customer shopping behavior and technological advancements. This forces organizations to adapt quickly to dynamically evolving circumstances. Most major organizations utilize business intelligence (BI) to support their corporate strategies, so the adaptability of BI has gained increasing importance in theory and industry practice over the last years. Agility is particularly challenging in the domain of BI since the underlying architecture of enterprise-wide decision support with data warehouse (DWH)-based BI is built not on agility, but on reliability and robustness. Although the usage of agile project approaches like Scrum has been explored, there is still a lack of research investigating further effects on BI agility. Hence, in an in-depth case study at a globally operating German sportswear designer and manufacturer, we analyzed whether the characteristics of DWH and BI impact the agility of BI. In particular, we wanted to identify whether a technology like in-memory (IM) can help achieve more BI agility. The findings indicate that IM technology acts as a technology enabler for agile BI: the impact of some DWH characteristics on BI agility is significantly positively influenced if IM technology is used.
Keywords: Business Intelligence, Agility, In-Memory Database, Case Study.
1 INTRODUCTION AND MOTIVATION
The empowerment of the consumer through emerging technologies has altered the retail industry dramatically in recent years. For instance, customers use smartphones to compare prices while shopping in stores or base their buying decisions on feedback from social media platforms. Additionally, an ever-growing list of online retailers offers to deliver products at competitive prices directly to the customer (MacKenzie et al. 2013). Market research (Davison and Burt 2014; MacKenzie et al. 2013; Mulpuru et al. 2014) indicates that these business and technology trends will completely change retail as we know it today. Some of these industry observers even predict more changes in the next few years than there were over the past century. Thus, retail organizations need to adapt quickly to this dynamically changing environment in order to stay successful. Managers and decision makers of retail companies must be able to promptly answer questions like where to sell products (large or small stores and/or online) or how to react to different pricing by competitors. They need to operate and adapt their multichannel strategies flexibly and frequently gain insights from huge amounts of customer or social media data.
Hence, fact-based decision making on a broad and reliable data basis, combined with being prepared for multiple scenarios, is one of the most obvious approaches. Decision making, like the execution of business processes, is usually supported by information systems (IS). Business intelligence (BI), as a distinct class of dispositive IS, is used as an instrument to understand and gain insights from internal and external information, and it was among the top priorities of CIOs in 2014 (Schulte et al. 2013). While primarily utilized to reflect operational performance (reporting-centric), organizations increasingly use BI to actively steer the future. Therefore, quick adaptation is crucial to support timely decision making for retailers as described above. However, achieving agility is particularly challenging in the domain of BI
(Moss 2009). The vision for BI has traditionally been a single, central repository of data that supports
operational and analytical functions for the entire organization. So the tasks of reporting and
consolidation typically have rigid requirements in terms of robustness, reliability, and non-volatility of
the data provided by the system (Inmon 1996). On the other hand, BI needs to adjust to changing
situations and must collect an enormous amount of data of the surrounding environment (Chen et al.
2012; Gartner 2011; Redman 2008). Many organizations utilize a data warehouse (DWH) as the basic
concept for BI. As a DWH is rather static by design, the question remains how BI can be adapted faster and therefore behave in a more agile way. Although the usage of agile project approaches like Scrum
(Schwaber 1997) has been explored, there is still a lack of research investigating further effects on BI
agility. Such effects may be achieved by different architectural approaches (Caruso 2011), adequate
organizational structures and processes (Zimmer et al. 2012) or technologies such as in-memory (IM)
(Evelson 2011). Current research activities identified positive impacts of in-memory databases
(IMDB) on BI (Knabke et al. 2014; Knabke and Olbrich 2011; Plattner 2009; Plattner and Zeier 2011).
But, the impact of IMDB on the agility of BI has not been sufficiently investigated and mostly
promoted by software vendors. Thus, the aim of this paper is to investigate if and how the usage of
IMDB affects the adaptability of BI. Therefore, we conducted a case study at a globally operating
German sportswear designer and manufacturer who implemented an architectural switch from disk-
based databases (DRDB) to IMDB. To accompany that project, we address the following research
questions:
Do requirements of BI agility negatively interact with the common DWH-based BI approach?
Does the usage of IM technology affect the agility of BI at the surveyed organization?
To achieve a common theoretical foundation, we first give a background on DWH-based BI and IMDB
and highlight the value of agility in the context of BI. Afterwards, we introduce our research approach
and the case study setting. The fourth section explains the data collection process in detail. Next, we
present the results of our study and their interpretation before considering its limitations. In the last
section, we describe our contribution as well as an outlook to future research opportunities.
2 THEORETICAL FOUNDATION
2.1 Data Warehouse-based BI and In-Memory Databases
BI can be defined as “a broad category of applications, technologies, and processes for gathering,
storing, accessing, and analyzing data to help business users make better decisions” (Watson 2009). It
is an umbrella term for systems and processes that turn raw data into useful information (Chen and
Siau 2012; Wixom and Watson 2010). Most multidimensional BI systems, particularly in
organizations with several source systems, utilize the DWH approach to systematically extract,
harmonize and provide data to reflect the organization’s single point of truth (Kimball and Ross 2002;
Rifaie et al. 2008; Watson 2009; Watson and Wixom 2007). A DWH is built to fulfill fundamental
requirements (Inmon 1996), i.e. integration, subject-orientation, time-variance and non-volatility. It
usually consists of several layers that physically store data if based on DRDB. Data is extracted from
source systems, transformed and loaded into the DWH. This process is called ETL-process (extract,
transform and load). The data is further cleansed, harmonized and consolidated inside the DWH as
single source of truth of an enterprise. To meet application-specific requirements the “general” data
can be enriched with business logic before it is made available for analysis and reporting. In DWHs based on DRDB, data is usually aggregated to meet performance and response time requirements during
analysis operations (Knabke and Olbrich 2011). In addition, many BI tools use a de-normalized
approach (e.g. star schema) (Kimball 1996) which allows for efficient read operations on big data
volumes.
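To make the de-normalized read pattern concrete, the following minimal sketch (illustrative Python with toy data; the table and attribute names are ours, not the case company's model) joins a small fact table to a product dimension and aggregates revenue per category:

```python
# Minimal star-schema sketch: one fact table referencing a dimension
# table (illustrative toy data only).
fact_sales = [  # (date_key, product_key, revenue)
    ("2014-01", "P1", 100.0),
    ("2014-01", "P2", 250.0),
    ("2014-02", "P1", 120.0),
]
dim_product = {"P1": "Shoes", "P2": "Apparel"}

def revenue_by_category(facts, products):
    """De-normalized read: join fact rows to the product dimension
    and aggregate revenue per category."""
    totals = {}
    for _, product_key, revenue in facts:
        category = products[product_key]
        totals[category] = totals.get(category, 0.0) + revenue
    return totals

print(revenue_by_category(fact_sales, dim_product))
```

The de-normalized layout trades storage redundancy for exactly this kind of cheap, read-only join-and-aggregate operation.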
Although data can be cached in the main memory of a DRDB system, it needs to be processed and
stored in several layers and the primary storage location remains a magnetic hard disk. Instead, an
IMDB keeps its data permanently in main memory of the underlying hardware. Main memory is
directly accessible by the CPU(s) and the access is orders of magnitudes faster (Garcia-Molina and
Salem 1992). Due to recent price reductions for main memory and the usage of dedicated compression
techniques it is now possible to even hold the entire data of large-size companies in-memory (Plattner
and Zeier 2011). IMDB-based BI infrastructures use column-oriented data storage to optimally
support online analytical processing (OLAP) applications like BI. Column-oriented storage also allows
for better-suited compression techniques and yields huge performance gains – up to a factor of 1,000 with real-life data (Plattner 2009).
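The advantage of column-oriented storage for OLAP-style scans can be sketched as follows (illustrative Python; the data and the simple run-length encoding are toy examples, not a vendor's actual compression scheme):

```python
# Why column stores suit OLAP: an aggregation scans only the needed
# attribute arrays, and low-cardinality columns compress well with
# run-length encoding (RLE). Toy example only.
rows = [("DE", 10), ("DE", 20), ("DE", 5), ("US", 7), ("US", 3)]

# Column-oriented layout: one contiguous array per attribute.
country_col = [r[0] for r in rows]
amount_col = [r[1] for r in rows]

def rle_encode(column):
    """Run-length encode a clustered column as (value, run_length) pairs."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1] = (value, runs[-1][1] + 1)
        else:
            runs.append((value, 1))
    return runs

# The aggregation touches only two columns, never whole rows.
total_de = sum(a for c, a in zip(country_col, amount_col) if c == "DE")
print(rle_encode(country_col), total_de)
```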
2.2 Agility and its Value for BI
The idea of organizational agility has been established in practice and discussed in literature for
decades and is not limited to IS. It originated from the field of manufacturing (Pankaj et al. 2009; van
Oosterhout et al. 2007) and has also been used for several years in different management areas, such
as corporate performance management or supply chain management. In business literature, it drew
mainstream attention through the work of Goldman et al. (1991) with regard to “agile manufacturing”.
Nevertheless, the definition of agility is ambiguous in scientific literature and industry (McCoy and Plummer 2006; van Oosterhout et al. 2007). While the term “agile” is described as the ability “to move
quickly and easily” by Oxford Advanced Learner’s Dictionary (Hornby and Cowie 1989, p. 23),
researchers have provided a wide range of definitions, often with deficiencies in the academic
approach to arrive at these definitions (Pankaj et al. 2009). In contrast, Conboy and Fitzgerald
(Conboy and Fitzgerald 2004b) conducted a cross-discipline literature review to derive a holistic
definition of agility. In particular, they investigated the underlying concepts of agility, i.e. flexibility
and leanness (Conboy 2009; Sharifi and Zhang 1999; Towill and Christopher 2002). They define
agility as “the continual readiness of an entity to rapidly or inherently, proactively or reactively, embrace change, through high quality, simplistic, economical components and relationships with its environment” (Conboy and Fitzgerald 2004a, p. 110). This definition is in line with that of Pankaj et al. (2009), who stated that agility must respect the abilities to sense a change,
diagnose a change as well as select and execute a response to a change in real-time. However, real-
time does not necessarily mean a very short amount of time, e.g. seconds. Instead, the actual physical
length of time is dependent on the context of the IS and may differ for strategic, tactical and
operational IS (Marjanovic 2007; White 2005) as they support business processes of different time
frames. For instance, the time frame for strategic processes may range from months to years (Pankaj et
al. 2009).
To get an understanding of agility in a BI context, we follow the framework suggested by Knabke and
Olbrich (2013). We believe this framework suits our research very well as the authors analyzed the
concept of agility in IS in a structured literature review and mapped their findings to the domain of BI.
As a result of their analysis they grouped similar constructs of BI agility as illustrated in Figure 1.
These agility dimensions are briefly explained in the following:
Change Behavior is a central construct of agility and describes the behavior of BI with regard to
change. Thus, a system can behave reactively or proactively, create change or even learn from it.
Perceived Customer Value (PCV) highlights the importance of quality, simplicity and economy as
values for the customer of BI.
Time describes the ability of BI to adapt to changing environments over time. This can either happen
in a continuous process or on an “ad-hoc” basis. The actual physical length of time is dependent on the
context of the IS and may differ for strategic, tactical and operational IS.
Process comprises the ability of BI to sense, analyze and respond to a change. Agile BI should support
methodologies and organizational structures to be able to quickly respond to changing requirements.
Model incorporates the architecture of BI. Agile BI may even require a new architectural approach
which is, among others, reusable, reconfigurable and scalable.
Approach describes the process method that is used in BI projects, e.g. traditional models such as
waterfall models or agile methods like Scrum.
Technology considers the underlying technology of BI. This may e.g. be a DRDB or an IMDB.
Environment of BI can be interpreted in multiple ways such as business processes, people, customers,
clients, or formalities. It respects the fact that the need for change often arises outside the IS.
Figure 1. Framework for BI agility
Knabke and Olbrich (2013) identified the BI creation process, i.e. approach, as one aspect of BI
agility. Unfortunately, the observed organization focused only on a technology shift and used only one project approach, which was a traditional one. Therefore, we neglect the dimension approach in
our study as we lack data to be analyzed. Nevertheless, to the best of our knowledge, no study
analyzes how a technological advancement like IMDB affects BI in terms of agility in a real-life case.
Thus, our study attempts to fill this research gap by analyzing how IM-based BI behaves according to
the agility dimensions above.
3 RESEARCH DESIGN – CASE STUDY RESEARCH
3.1 Case Study Research
We chose case study research to determine how IM technology in particular affects BI agility.
According to Yin (1994) a case study is “an empirical inquiry that investigates a contemporary
phenomenon within its real-life context, especially when the boundaries between phenomenon and
context are not clearly evident” (Yin 1994, p. 13). We believe that the method of case study research
is well-suited for our problem for several reasons (Dubé and Paré 2003; Yin 1994). First, case study
research provides a way to analyze the impact of technology on BI agility in a natural setting of a real-
life use case without exerting control over the process or setup of the BI transformation project using
IM technology. Second, since the BI agility phenomenon we are investigating cannot be separated
from the BI transformation project, the boundaries between phenomenon and context are not clearly
obvious. Third, we use multiple sources of evidence, namely qualitative and quantitative data.
Our case study research method consists of three steps (Dubé and Paré 2003). The first step is research
design. It “refers to the attributes associated with the design of the study” (Dubé and Paré 2003,
p. 605). Data collection as the second step describes the quality of the data collection process
including the data collection methods (qualitative and quantitative). The third step, data analysis, is
concerned with the process description, the use of techniques as well as modes of data analysis.
3.2 Case Study Design: The Quest for Agile BI at a German Sportswear Label
The case study was conducted at a globally acting sportswear designer and manufacturer with several billion euros in revenue and a profit of several hundred million euros. The company, headquartered in Germany, employed more than 50,000 people in 2014 and is one of the biggest sportswear designers
and manufacturers in the world. Its industry has seen some rapid change and fierce competition in
recent years. Hence, the company wants to react to market changes earlier than its competitors and
introduce agile analytics. In a global BI transformation project the organization aimed to consolidate
and transform all global BI applications from different DRDB landscapes into one single, IMDB-based platform. The initiative, which started at the end of 2012, aims to offer high-performance reporting and business analytics capabilities based on a consolidated and harmonized data basis with a globally agreed understanding of data structures and key figures (“one version of the truth”). The BI
transformation has a high profile within the organization as BI is seen as a main driver to turn data into
business relevant information. The initiative sets the foundation for several advanced business
scenarios to allow even more fact-based decision making. Examples of these advantages are big data
analytics, predictive analytics or integrated business planning which were technically not feasible
without the transformation program. The program is planned to be finalized completely in 2015.
Nevertheless, some major BI applications have already gone live, which justifies an in-depth scientific analysis at this time.
The former BI landscape consisted of two major DWH-based BI systems. One DWH system
contained all retail-related data. All other BI-related information (manufacturing, finance etc.) was
stored in a second DWH system. Both systems are based on DRDB with a necessary performance add-
in to meet at least minimum performance requirements. According to the BI responsible no reporting
and querying was possible without this additional performance infrastructure (analytical database in
Figure 2). In the new landscape only one DWH-based BI system exists which is completely based on
an IMDB. The overall system landscape is shown in Figure 2.
Figure 2. System landscapes overview (left: the former DRDB BI landscape with two global DWHs, local DWHs, local BI tools and an additional analytical database; right: the new IMDB BI landscape with a single global DWH, local schemas and local BI add-ons, connected by the BI transformation initiative)
3.3 Unit of Analysis
The phenomenon under investigation is the impact of IM technology on the agility of BI. As the
organization utilizes a DWH as the basic concept for its BI activities, we use the criteria for DWH
constituted by Inmon (1996) as a starting point of our analysis. Inmon claimed that the integration of
data from (diverse) sources ensures consistency and yields a single point of truth. Second, BI elements
should be organized according to the subject areas of the organization (subject-orientation).
Structures in a DWH need to contain a connection to time to show changes over time (time-variance).
Fourth, data in the DWH should never be altered (non-volatility). Based on the evidence of high BI project failure rates (Chenoweth et al. 2006; Hwang and Xu 2005; Joshi and Curtis 1999; Olszak 2014; Shin 2003), we assume that the fundamentals of DWH-based BI as currently operated, i.e. based on DRDB, contradict the requirements of today’s agile environments. This underlines the importance of agile BI.
Figure 3. Unit of analysis (the DWH/BI characteristics integration, subject-orientation, time-variance and non-volatility influence the BI agility dimensions change behavior, perceived customer value, time, process, model and environment; technology, i.e. disk-resident vs. in-memory databases, acts as moderator)
To identify the impacts of DWH-based BI on BI agility we propose our research model as depicted in
Figure 3. Each DWH-based BI characteristic (Inmon 1996) may influence BI agility (Knabke
and Olbrich 2013). As an exemplary impact, integration may affect BI agility in terms of model. As we primarily aim to analyze the impact of IM technology on BI agility in our study, the utilized technology is a central construct of our unit of analysis. Therefore, we include technology as a
moderator in our research model. In a moderator effect the impact of an independent variable (here
DWH/BI characteristics) on an outcome variable (here BI agility) depends on a third variable, the
moderator variable (here technology) (Hayes and Matthes 2009). Taking the moderator into account,
the above stated example can be extended so that the impact of integration on BI Agility in terms of
model is influenced by technology.
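The moderator logic can be illustrated with a minimal sketch (Python with synthetic numbers; the study itself estimated the full PLS model in SmartPLS): the same path is estimated separately per technology group, and the difference of the two coefficients is read as the moderating effect.

```python
# Sketch of moderation with a dichotomous moderator: fit the same
# simple regression per technology group; the difference of the path
# coefficients is the group-comparison estimate of the moderation.
# All numbers are synthetic and for illustration only.

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical ratings: integration (x) vs. agility in terms of model (y).
x_drdb, y_drdb = [1, 2, 3, 4], [1.0, 1.1, 1.3, 1.4]   # weak link
x_imdb, y_imdb = [1, 2, 3, 4], [1.0, 1.8, 2.4, 3.2]   # stronger link

pc_drdb = slope(x_drdb, y_drdb)
pc_imdb = slope(x_imdb, y_imdb)
moderation = pc_imdb - pc_drdb  # group-difference estimate
print(round(pc_drdb, 2), round(pc_imdb, 2), round(moderation, 2))  # 0.14 0.72 0.58
```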
4 DATA COLLECTION
Different sources of evidence, e.g. qualitative and quantitative data, can be used to investigate the unit
of analysis to provide a broad picture of the phenomenon of interest (Dubé and Paré 2003; Yin 1994).
Hence, we followed a five-step approach to obtain the necessary data as illustrated in Figure 4. It
integrates two data collection techniques as well as qualitative and quantitative data. First, we
developed a survey-based questionnaire and scrutinized it with a group of researchers in our institute.
The questionnaire was developed following the rules of Dillman (Dillman 1978; Dillman 2000; Dillman et al. 2009). It contains each question for both the new (IMDB) and the old (DRDB) BI landscape. Additionally, it includes control questions (Bhattacherjee 2012). Second, we conducted six semi-structured individual interviews with experts at the case study company. Within these sessions we verified our research model and the questionnaire. The experts were chosen from business and IT (three each) to cover both points of view. We selected them as
they play a major role in the BI transformation project. The group of experts consisted of project
managers as well as responsible functional and technical stakeholders. According to the results of the
interviews we adapted our research model and the questionnaire in a third step. Step four comprised a structured, self-administered survey (Leeuw et al. 2008). The questionnaire was
available on the web for a period of three weeks to a closed group invited via email. For each
variable, e.g. perceived customer value, the participants rated a set of four to ten statements. The
statements “The old BI systems were easy to understand and use.” and “The new BI system is easy to
understand and use.” are one example of the dimension perceived customer value. The answers
consisted of non-dichotomous 7-point Likert scales. The participants’ responses can be aggregated in a
standardized manner and used for quantitative analysis with this study approach (Bhattacherjee 2012).
We evaluated the answers of the survey using quantitative (statistical) methods (see section 5). We
discussed the findings of this analysis with five experts in the last step (one business expert was not
available due to holiday season) to find explanations for (especially unforeseen) results (see section 6).
Figure 4. Data collection process
The results of the quantitative evaluation (step four in Figure 4) have been achieved by using a gradual
approach that is illustrated in Figure 5. First, we coded the answers of the survey respondents to numeric values. The 7-point Likert scale answers from “strongly disagree” to “strongly agree” were coded with “-3” to “+3”. Afterwards, we calculated basic survey statistics such as the distribution of IT and
business participants. In step three we conducted an analysis of correlation within the BI
characteristics proposed by Inmon (1996). Fourth, we analyzed the relation between BI characteristics
and BI agility without distinguishing between the used technologies DRDB and IMDB. Last, in
pursuit of our second research question, we looked at the impact of BI characteristics on BI agility and
how the moderator variable technology affects this relation. For step three we used the standard
statistical method of analysis of correlation. For the quantitative analysis of step four and five in
Figure 5 we applied partial least squares (PLS) as an established mathematical procedure (Chin 1998;
Kline 1998; Rönkkö et al. 2012) to identify the path coefficients (PC). Various approaches exist to test
the existence of moderation depending on the scale of the moderator effects (Henseler and Chin 2010).
Based on the setting of the use case we used a pragmatic approach to determine the moderator effect
of the variable technology. Since the assumed moderator is dichotomous, i.e. only has two values (old
BI system, new BI system), a moderating effect can be assessed by a group comparison. Therefore, we
calculated the difference between the PCs for the old and new model. This difference can be
interpreted as the moderating effect of the variable technology (Henseler and Fassott 2010).
Figure 5. Steps of the quantitative survey evaluation
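The first evaluation step, coding the 7-point Likert answers to numeric values, can be sketched as follows (Python; the intermediate answer labels are assumed, as only the scale endpoints are quoted in the text):

```python
# Sketch of step one: coding 7-point Likert answers to -3..+3.
# Intermediate labels are assumptions; the survey's wording may differ.
LIKERT_CODES = {
    "strongly disagree": -3, "disagree": -2, "somewhat disagree": -1,
    "neutral": 0,
    "somewhat agree": 1, "agree": 2, "strongly agree": 3,
}

def code_answers(answers):
    """Map textual Likert answers to numeric codes for analysis."""
    return [LIKERT_CODES[a.lower()] for a in answers]

print(code_answers(["Strongly agree", "Neutral", "Disagree"]))  # [3, 0, -2]
```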
Both the BI characteristics and the dimensions of agility are not directly measurable and were therefore modelled as latent variables (Backhaus et al. 2013). Thus, we assigned indicators to each
latent variable. Statements in the questionnaire represent these indicators. Participants of the survey
rate their agreement according to the statements. Each of the latent variables is then determined as a
weighted linear combination of its indicators. The dependent latent variables are a weighted linear
combination of the independent latent ones (Backhaus et al. 2013). We chose PLS because it can cope with small sample sizes as well as non-normal data and is adequate for exploratory research (Hair et al. 2011). It is advisable to check for outliers to ensure reliable data quality (Weiber and Mühlhaus
2010). Therefore, we used an agglomerative clustering technique called single-linkage or nearest-neighbor clustering (Backhaus 2011). Such algorithms initially treat every data set as a cluster and
then start reducing the number of clusters by joining them until only one cluster containing all data
sets is left. The determination of the outliers was done by using a rule of thumb which has been proven
in practice (Backhaus 2011). For the quantitative analysis we used the software tools SPSS Statistics
Version 22 (IBM 2013) and SmartPLS Version 3.1.3 (SmartPLS 2014).
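The outlier screening via single-linkage clustering can be sketched as follows (plain Python on one-dimensional toy scores; the study applied the technique with a practical rule of thumb not reproduced here). The algorithm repeatedly merges the two closest clusters; a point that joins the rest only at a conspicuously large merge distance is an outlier candidate:

```python
# Sketch of single-linkage (nearest-neighbor) agglomerative clustering
# for outlier screening on 1-D toy scores. Illustrative only.

def single_linkage_merge_heights(points):
    """Agglomerate clusters by smallest inter-cluster distance
    (single linkage) and return the merge distances in order."""
    clusters = [[p] for p in points]
    heights = []
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        heights.append(d)
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return heights

scores = [1.0, 1.2, 1.1, 0.9, 5.0]   # 5.0 is a clear outlier
heights = single_linkage_merge_heights(scores)
print(heights)  # the last, much larger merge isolates the outlier
```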
5 DATA ANALYSIS AND RESULTS OF THE QUANTITATIVE
EVALUATION
The email distribution list contained 187 persons. 69 of the invited persons accessed and started the survey (36.9%). 43 of them completed the questionnaire, i.e. 62.3% of the starters or 23% of all invited persons. After removing outliers as described above, 39 participants remained (20.9% of the invitees). As every participant was asked questions regarding both the old and the new landscape, we obtained a total of 78 answer sets. 14 of the 39 participants (35.9%) were users from functional or business departments, 24 (61.5%) were from IT and one (2.6%) answered “other”. Regarding their involvement in the BI transformation, three participants (7.7%) were business project members, 13 were IT project members from the organization itself (33.3%) and another 11 were external project members (28.2%). 12 participants (30.8%) were end-users with no project involvement.
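The reported response rates can be verified with a few lines (numbers taken from the text):

```python
# Quick check of the reported survey response rates.
invited, started, completed, after_outliers = 187, 69, 43, 39

print(round(100 * started / invited, 1))         # 36.9
print(round(100 * completed / started, 1))       # 62.3
print(round(100 * completed / invited))          # 23
print(round(100 * after_outliers / invited, 1))  # 20.9
```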
We analyzed the answer sets (n=78) without differentiating between the underlying technologies
(IMDB and DRDB) to determine correlations within the DWH characteristics proposed by Inmon
(1996). All four variables, i.e. subject-orientation, integration, time-variance and non-volatility,
correlate positively and no negative correlations exist. The coefficients are between 0.38 and 0.60.
Values below 0.9 are deemed acceptable as very high correlation would question the definition of the
latent variables (Huber 2007).
5.1 Dependencies between BI Characteristics and Agility
Table 1 (n=78) summarizes the dependencies between DWH/BI characteristics and BI agility without moderator distinction. Considering four variables within the DWH-based BI characteristics and six variables in BI agility, this sub model contains 24 relations. The table shows the PCs between
DWH/BI characteristics and the agility of BI. PCs are the weights of the paths obtained by regressions.
Every PC should at least have an absolute value of 0.1 (Huber 2007). Four of the 24 PCs did not meet
this threshold. In return, 83.3% fulfill the criterion. We applied bootstrapping to assess the significance
of the paths. At a significance level of 10%, the resulting t-values (t) should exceed 1.65 (Huber
2007). As shown in Table 1, 41.7% (10 of 24) of the relations are significant at a 0.1 level and meet
the criterion.
| | Change Behavior | PCV | Time | Process | Model | Environment |
|---|---|---|---|---|---|---|
| Subject-Orientation | .02 (0.08) | .33* (1.90**) | .29* (1.89**) | .00 (0.05) | .13* (0.07) | .12* (0.83) |
| Integration | .11* (0.81) | .17* (1.13) | .06 (0.50) | .19* (1.86**) | .10* (0.53) | .16* (1.09) |
| Time-Variance | .14* (0.65) | .08* (0.46) | .05 (0.36) | .32* (2.45**) | .24* (1.27) | .36* (2.22**) |
| Non-Volatility | .47* (2.75**) | .29* (1.68**) | .33* (2.48**) | .40* (2.96**) | .33* (2.08**) | .10* (0.61) |
Notes: cells show PC (t-value)
* Path coefficient above threshold (0.10)
** Significant t-value, i.e. t-value above threshold (1.65)
Table 1. Path coefficients between DWH-based BI characteristics and BI agility
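The bootstrap-based significance assessment can be sketched as follows (Python with synthetic data and a simple OLS path in place of the full PLS model, which was estimated in SmartPLS): the coefficient is re-estimated on resamples drawn with replacement, and a t-value is formed as the original estimate over the bootstrap standard error.

```python
# Sketch of bootstrapping to assess path significance. Synthetic data;
# a plain OLS slope stands in for a PLS path coefficient.
import random

def ols_slope(pairs):
    """Ordinary least-squares slope of y on x for (x, y) pairs."""
    mx = sum(x for x, _ in pairs) / len(pairs)
    my = sum(y for _, y in pairs) / len(pairs)
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x, _ in pairs)
    return num / den

random.seed(42)
data = [(x, 0.4 * x + random.gauss(0, 0.5)) for x in range(20)]

estimate = ols_slope(data)
boot = [ols_slope([random.choice(data) for _ in data]) for _ in range(500)]
mean_b = sum(boot) / len(boot)
se = (sum((b - mean_b) ** 2 for b in boot) / (len(boot) - 1)) ** 0.5
t_value = estimate / se
print(t_value > 1.65)  # significant at the 10% level in this example
```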
5.2 Impacts of Technology on the Agility of BI
We split the data set (n=78) along the lines of the variable technology as the moderator is
dichotomous. This results in two groups of n=39 responses each (DRDB and IMDB) and facilitates the
assessment of a moderating effect. Table 2 illustrates the PCs of the PLS procedure for these separate
estimates. The impact of the moderator variable technology is the difference (Diff) between the new (IMDB) and the old (DRDB) landscape (Henseler and Fassott 2010). Although all paths with an absolute coefficient of 0.10 or higher could already be included in moderating considerations (Huber 2007), we chose a cut-off value of 0.25 to reflect our more conservative approach.
| | | Change Behavior | Perceived Customer Value | Time | Process | Model | Environment |
|---|---|---|---|---|---|---|---|
| Subject-Orientation | DRDB | -0.25 | 0.14 | 0.15 | -0.22 | 0.05 | -0.03 |
| | IMDB | 0.07 | 0.43 | 0.47 | 0.08 | 0.40 | 0.31 |
| | Diff | 0.31*+ | 0.29* | 0.33*+ | 0.31*+ | 0.35* | 0.35*+ |
| Integration | DRDB | 0.31 | 0.04 | -0.07 | 0.25 | 0.01 | 0.21 |
| | IMDB | 0.24 | 0.39 | 0.01 | 0.23 | -0.10 | 0.24 |
| | Diff | -0.07 | 0.35* | 0.08 | -0.02 | -0.11 | 0.03 |
| Time-Variance | DRDB | 0.25 | 0.20 | 0.07 | 0.36 | 0.21 | 0.41 |
| | IMDB | -0.10 | 0.20 | 0.07 | 0.23 | 0.40 | 0.22 |
| | Diff | -0.35** | 0.00 | 0.00 | -0.13 | 0.19 | -0.19 |
| Non-Volatility | DRDB | 0.44 | 0.38 | 0.43 | 0.57 | 0.45 | 0.17 |
| | IMDB | 0.58 | -0.02 | 0.27 | 0.45 | 0.23 | -0.03 |
| | Diff | 0.14 | -0.41**+ | -0.16 | -0.12 | -0.22 | -0.20 |
Notes: * significant positive moderating effect (difference >= +0.25)
** significant negative moderating effect (difference <= -0.25)
+ Deviation in Diff due to rounding
Table 2. Comparison of path coefficients as a moderator analysis
In total, nine moderated paths were identified (37.5%). In seven cases (29.2%) the BI transformation
appears to enhance the relationship between DWH-based BI characteristics and agility. In contrast, in
two cases (8.3%) the relationship seems to have deteriorated. The results have been analyzed in a
second round of expert interviews and are described in section 6.
5.3 Technical Considerations
In addition to the quantitative evaluation of the survey data, we conducted a technical analysis of the
BI transformation. The former BI landscape consisted of two BI systems, whereas the new IMDB-based
BI system is implemented in one system (see Figure 2). The transformation project reduced the number
of objects and the hardware used. The old DRDB systems required 32 servers, whereas the new system
requires 12. The database size dropped from 39.5 terabytes (TB) to 4.6 TB (-88.2%). The number of
objects used for reporting and data analytics, including the underlying architecture, was reduced from
35,153 to 5,517 (-84.5%). However, one has to keep in mind that the BI transformation project is not
completely finalized yet. Thus, the storage space consumed as well as the number of objects will
possibly rise, but remain well below the previous levels. Increasing the performance and flexibility of
the BI application played a major role in the organization's decision to implement the transformation
program. This was achieved not only by migrating the BI application and its data to IM-based
technology; the program also consolidated data models and adapted business logic within the
applications. This complicates the selection of a process for a performance analysis that yields reliable
and comparable results. The process recommended as comparable by the organization is sales and
margin reporting. This application enables users to perform general analyses of sales and margin
figures, e.g. from gross sales down to standard margin by customer, article, reporting unit, period and
further dimensions.
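As an illustration (using hypothetical data, not figures from the study), such a drill-down from gross sales to standard margin along one dimension amounts to a simple aggregation:

```python
# Hypothetical sales records: (customer, article, gross_sales, standard_margin)
records = [
    ("C1", "A1", 100.0, 40.0),
    ("C1", "A2", 200.0, 90.0),
    ("C2", "A1", 150.0, 60.0),
]

def margin_by(records, key):
    """Aggregate gross sales and standard margin along one dimension."""
    totals = {}
    for customer, article, sales, margin in records:
        k = customer if key == "customer" else article
        s, m = totals.get(k, (0.0, 0.0))
        totals[k] = (s + sales, m + margin)
    return totals

print(margin_by(records, "customer"))  # {'C1': (300.0, 130.0), 'C2': (150.0, 60.0)}
```

In the IMDB-based system such aggregations are computed on the fly rather than precomputed in a data mart layer.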
The information logistics process at the studied organization consists of several layers as illustrated in
Figure 2 and is typical for DWH-based BI. First, data is extracted from the source systems to the
extraction layer. Second, the data is cleansed, consolidated and harmonized and afterwards enriched
with application- or function-specific logic (corporate and propagation layer). If a DWH is based on a
DRDB, a dedicated reporting layer (data mart layer) is usually required for performance reasons. For
the performance part of this technical consideration we analyzed the end-to-end process of sales and
margin reporting, from the extraction of the data out of the source systems until the data is available
for analysis. Overall, the time to provide data for reporting and analysis decreased from 1:51 hours to
9 minutes (-92%) for a comparable data set (~0.2 million records). The DRDB system required a data
mart layer and additional technical infrastructure for performance reasons. This layer is no longer
necessary in the new landscape (-100%).
Process Step / Data Flow           DRDB BI systems   IMDB BI system   Difference
                                   (hh:mm:ss)        (hh:mm:ss)
Data Mart Layer                    01:41:31          00:00:00         -100%
Corporate and Propagation Layer    00:08:05          00:08:36         +6%
Acquisition Layer                  00:01:44          00:00:40         -62%
Overall                            01:51:19          00:09:17         -92%
Table 3. Exemplary ETL process comparison between the old and the new BI landscape
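The percentage differences in Table 3 follow directly from the layer runtimes; a small sketch (an illustration, not part of the original study) reproduces them from the hh:mm:ss values:

```python
def to_seconds(hms):
    """Convert an 'hh:mm:ss' duration string into seconds."""
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s

def runtime_change(before, after):
    """Rounded percentage change of a process step's runtime."""
    b, a = to_seconds(before), to_seconds(after)
    return round(100 * (a - b) / b)

# Acquisition layer: 00:01:44 -> 00:00:40 yields -62%, as in Table 3.
print(runtime_change("00:01:44", "00:00:40"))
```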
6 IMPLICATIONS AND REFLECTION IN SEMI-STRUCTURED
EXPERT INTERVIEWS
6.1 Interpretation
Referring to our first research question, we identified Inmon's criteria to be complementary factors.
They correlate positively and do not contradict each other. Hence, following all of Inmon's criteria
does not result in any disadvantageous effects for the DWH approach in practice. Regarding the
conflict between agility requirements and the common DWH-based BI approach, the answer seems to
depend on the underlying variables and the underlying technology. All PCs are positive, ten of 24
significantly so, when looking at the complete answer set (Table 1). This would contradict the conflict
suspected in our first research question. However, the first impression changes if the DRDB and
IMDB landscapes are considered separately. For the DRDB landscape four negative relations exist,
two of them with PCs below -0.1. For the IMDB-based BI system four negative PCs exist as well, but
all of them lie between -0.1 and -0.02. Although this is only a weak indication, it seems that the basic
DWH assumptions have a stronger negative impact on BI agility if based on a DRDB.
IMDB technology has a significant positive impact on nearly a third of the relations (seven out of 24
or 29.2%) between the basic DWH assumptions and the agility of BI according to Table 2. It has a
positive moderating effect on the influence of subject-orientation on every agility dimension of BI.
The differences in PCs were significantly positive for all six paths. The development of the new
BI system was driven by key business players, which enabled the IT department to ensure as many
business benefits as possible. The IM technology enables the implementation of metrics and KPIs that
were not realizable before due to performance limitations. The experts mentioned the simplicity and
the reduced time needed to implement changes in the IMDB-based BI system as one of the main
achievements of the BI transformation. Another new feature is drill maps, which support end users in
better analyzing the data and finding root causes for changes in important KPIs. The underlying
IMDB technology allows for
on-the-fly calculations and thereby increases customer value and business benefits. In addition, the
experts agreed with the participants’ opinion that the new BI system allows for a higher level of self-
service. They asserted that this is a combination of the system consolidation, the computing power of
the IMDB and the choice of the frontend tool. A non-technology-related reason might be that people
are still learning how to exploit the full possibilities of ad-hoc reporting. Overall, the interviewed
experts confirmed that the transformation has indeed materialized in a boost of agility.
The effect of integration on perceived customer value has also improved remarkably through the use
of IM technology. Besides the positive influence of the new technology itself, some experts
highlighted the effect of the system consolidation into one database: there is now a single point of
truth, since all information is combined in one place and consistent. This simplifies the creation of
reports that span some of the formerly separated data and might therefore even drive changes in the
reporting culture. The BI personnel now has formerly unknown possibilities in reporting by merging
data. In the DRDB systems, the creation of such function-spanning reports could only be achieved by
time- and effort-consuming transformations, which in turn led to redundancies. Moreover, the
IMDB-based BI enables the storing of local master data, which contributes to both the level of
self-service and the possibility of ad-hoc changes, since there is no need for global agreement before a
change in local master data can be decided.
In two of 24 cases the impact of IMDB is negative (8.3%). Time-variance negatively influences the
change behavior of the BI system. One explanation the experts gave for this negative effect was that
the number of snapshots was reduced in the new BI landscape. This limits the possibility of gaining
deeper insights into business backgrounds, such as backtracking delivery statuses and the reasons for
backlogs. In summary, the negative effect was not caused by the IM technology itself but by decisions
to reduce the scope of the BI system.
The BI transformation had negative consequences for the effect of non-volatility on the perceived
customer value. Although non-volatility seemed to support the perceived customer value in the old BI
landscape, this effect disappeared when moving to the new BI landscape. Confronted with this fact,
the reactions of the experts were again mixed. One expert stated that the current state of the data might
temporarily be more volatile. Another reason that both the IT and the business expert from one
department gave is related to the explanation for the negative impact of time-variance and change
behavior. Snapshots have been reduced from a daily to a weekly basis. This drastically limits the
ability to track and analyze fulfillment rates of deliveries to customers over time, which is an
important KPI in the organization of the use case. Again, the negative impact is more related to a
business decision after weighing the necessity of daily snapshots against storage expenses than to the
capability of technology to moderate BI agility.
6.2 Evaluation of the Methodical Problems in the Study of a Single Case
Lee (1989) identified four problems that are relevant for case study research in management
information systems. In particular, he highlighted the challenges for case study research of a single
case in a natural, real-world setting. These are the problems of making controlled observations,
making controlled deductions, allowing for replicability as well as allowing for generalizability.
We evaluated a single running artifact in our case study. One part of our analysis was quantitative, by
means of statistics. As is natural in MIS case study research, we do not have controls as found in
laboratory experiments, but we used statistical controls (e.g. control variables) to achieve controlled
observations. Second, the evaluation of our questionnaire used quantitative statistics. As these rely on
mathematics, a subset of formal logic, we satisfied Lee's second claim of making controlled
deductions. Third, the postulate of replicability is hard to achieve for MIS case studies in a real-world
setting, as it is very unlikely to observe the same setting of individuals, groups, social structure,
hardware or software again in the same way (Lee 1989). However, the same theories can be applied to
a different set of initial conditions. Although observations are hardly replicable in MIS case studies,
their findings are replicable (Lee 1989). Our case study is based on the BI transformation of two
DRDB-based BI landscapes into one system based on an IMDB. A DWH builds the basis for BI in
both environments. These settings are not limited to the considered organization but can be repeated
at other companies with a different set of initial conditions. The theoretical assumption that IMDB
enables BI agility can be transferred to other organizations, too. As the focus lies on IMDB and DWH,
the fact that two systems were consolidated into one can even be neglected. Thus, the case study's
findings, i.e. proving or disproving our theory, are replicable. Last, the major design assumption in our
case study is the existence of a DWH-based BI system which is transformed to an IMDB. The
company in the case study is a large German sports fashion company. Yet, the architecture of the BI
system, with its multi-layer approach, is typical for DWH-based BI, regardless of the size, branch or
culture of the organization. That being said, organizational processes and the culture of decision
making certainly may impact design decisions. Nevertheless, our theory is generalizable if it is
confirmed by additional case studies in a variety of situations. Thus, we are convinced that we made
controlled observations and deductions in our study of a single running artifact, and that the core
results are replicable and can be generalized.
6.3 Limitations of the Study
The results must be carefully reflected in the light of the study’s limitations. The weaknesses of the
model as explored in the evaluations might well be attributed to the relatively small sample size (n=39
and thus 78 answer sets in total). The results of a bigger survey among people from different industries
might differ from the ones presented above. Obviously, a higher number of participants and resulting
data sets would have been desirable and might have stabilized the results, but unfortunately could not be
realized due to the limited number of potential participants. However, this case study serves as a
starting point as the intensive usage of IMDB as technology basis for BI is just beginning in an
organizational context. Companies are just starting to explore the full capacity and we assume that
positive impacts will even rise in the future. The negative moderating effects in our case study are only
partially connected to the introduction of the in-memory technology as confirmed by the experts
during the interviews. Limitations for end-users are not necessarily a consequence of the technology,
but rather a compromise which was made as a management decision for monetary reasons. Regarding
the performance and technical investigation we need to point out that the BI transformation is not
finalized in terms of migrating applications to the new landscape. This may have an impact on the
utilized hardware. However, continuous improvement is a common phenomenon in IS and is referred
to as design cycles in design science. Comparing end-to-end processes in every detail is hardly
possible. Nevertheless, our examination delivers an insight into the order of magnitude of the
performance gains that IMDB offers. The type of study may also be limiting: an observational study
like this cannot control confounding variables such as the usage of agile process methods, which could
bias the findings.
7 CONTRIBUTION AND OUTLOOK
Our overall research goal is to contribute to the field of agility in the context of BI. We consider this
discussion as highly valuable, since we assumed that the underlying requirements of DWH and thus BI
(Inmon 1996) contradict the agility framework introduced above and hence, the agility of BI in
general. Therefore, we evaluated our design assumptions on how agility in decision support systems
can be achieved through a case study analysis of a globally operating German sportswear label.
Though limited to observations in a single case study, the results indicate that some criteria for
building DWHs as well as the agility criteria seem to be complementary and positively correlated. But
above all, our analysis showed that IM technology appears to be a technology enabler for agile BI.
Referring to our second research question, the effect of some DWH-based BI characteristics on BI
agility becomes positive if IM technology is used, or is positive at all only when using IM. Moreover,
practitioners and other organizations will benefit first-hand from these concepts. For instance,
subject-orientation positively influenced all BI agility dimensions when IM technology was used.
IMDB enables the realization of KPIs that could not be implemented with former technologies. In
addition, IM technology brings simplicity and reduces the time and effort for change requests at our
case study organization. This will move decision support by BI systems away from solely dispositive,
analytical support based on historic reflection toward actively steering the future using data already
available in companies. Besides the positive integration effect of the system consolidation, BI
customers can now conduct function-spanning analyses. Moreover, IMDB-based BI contributes to a
higher level of self-service and ad-hoc analysis. The gain in perceived customer value will affect the
way BI supports organizations' decision processes as well as their corporate strategies, thus
contributing to corporate success.
In our prospective research agenda we will pick up on the limitations of the study. Next, experts
(scientists and practitioners) at The Data Warehouse Institute (TDWI) will be surveyed. This ensures
industry-spanning responses from BI practitioners, consultants as well as scientists to validate our
results. It will also mitigate ambiguous statistical results by targeting a more heterogeneous group and
a bigger sample size. Our future research will also extend this work by analyzing the development
approach (e.g. agile process methods) as a potential moderator of BI agility.
References
Backhaus, K. (2011). Multivariate Analysemethoden: Eine anwendungsorientierte Einführung ;
[Extras im Web]. 13., überarb. Aufl. Springer-Lehrbuch. Heidelberg [u.a.]: Springer.
Backhaus, K., Erichson, B. and Weiber, R. (2013). Fortgeschrittene multivariate Analysemethoden:
Eine anwendungsorientierte Einführung ; [mit Extras im Web]. 2., überarb. und erw. Aufl.
Springer-Lehrbuch. Berlin [u.a.]: Springer Gabler.
Bhattacherjee, A. (2012). Social Science Research: Principles, Methods, and Practices. 2nd ed. Book
3: Open Access Textbooks.
Caruso, D. (2011). Bringing Agility to Business Intelligence. Url: http://www.information-
management.com/infodirect/2009_191/business_intelligence_metadata_analytics_ETL_data_mana
gement-10019747-1.html (visited 04/13/2012).
Chen, H., Chiang, R. H. L. and Storey, V. C. (2012). Business Intelligence and Analytics: From Big
Data to Big Impact. MIS Quarterly, 36 (4), 1165–1188.
Chen, X. and Siau, K. (2012). Effect of Business Intelligence and IT Infrastructure Flexibility on
Organizational Agility. In: Proceedings of the International Conference on Information Systems,
ICIS 2012, Orlando, Florida, USA, December 16-19, 2012.
Chenoweth, T., Corral, K. and Demirkan, H. (2006). Seven Key Interventions for Data Warehouse
Success. Communications of the ACM, 49 (1), 114–119.
Chin, W. W. (1998). The Partial Least Squares Approach to Structural Equation Modeling. In: Modern
Methods for Business Research (pp. 294–336). Ed. by G. A. Marcoulides. Hillsdale, NJ: Lawrence
Erlbaum Associates.
Conboy, K. (2009). Agility from First Principles: Reconstructing the Concept of Agility in
Information Systems Development. Information Systems Research, 20 (3), 329–354.
Conboy, K. and Fitzgerald, B. (2004a). Toward a Conceptual Framework of Agile Methods. In:
Lecture Notes in Computer Science. Extreme Programming and Agile Methods - XP/Agile
Universe 2004. 4th Conference on Extreme Programming and Agile Methods, Calgary, Canada,
August 15-18, 2004. Proceedings (pp. 105–116). Ed. by C. Zannier, H. Erdogmus, & L. Lindstrom.
Berlin / Heidelberg: Springer.
Conboy, K. and Fitzgerald, B. (2004b). Toward a Conceptual Framework of Agile Methods: A Study
of Agility in Different Disciplines. In: Proceedings of the 2004 ACM workshop on
Interdisciplinary software engineering research, pp. 37–44.
Davison, J. and Burt, M. (2014). Top Retail Business and Technology Trends. Url:
https://www.gartner.com/doc/2727618/top-retail-business-technology-trends (visited 03/16/2015).
Dillman, D. A. (1978). Mail and telephone surveys: The total design method. A Wiley-Interscience
publication. New York: Wiley.
Dillman, D. A. (2000). Mail and internet surveys: The tailored design method. 2nd ed. New York, NY:
Wiley.
Dillman, D. A., Smyth, J. D. and Christian, L. M. (2009). Internet, mail, and mixed mode surveys: The
tailored design method. 3rd ed. Hoboken, NJ: Wiley.
Dubé, L. and Paré, G. (2003). Rigor in Information Systems Positivist Case Research: Current
Practices, Trends, and Recommendations. MIS Quarterly, 27 (4), 597–635.
Evelson, B. (2011). Trends 2011 And Beyond: Business Intelligence. Agility Will Shape Business
Intelligence For The Next Decade. Forrester Research. Url:
www.forrester.com/go?objectid=RES58854 (visited 04/13/2012).
Gartner. (2011). Gartner Says Solving 'Big Data' Challenge Involves More Than Just Managing
Volumes of Data. Url: http://www.gartner.com/it/page.jsp?id=1731916 (visited 04/11/2012).
Goldman, S., Preiss, K., Nagel, R. and Dove, R. (1991). 21st Century Manufacturing Enterprise
Strategy: An Industry-Led View. Volume 1. Bethlehem, Pa: Iacocca Institute, Lehigh University.
Hair, J., Ringle, C. and Sarstedt, M. (2011). PLS-SEM: Indeed a Silver Bullet. The Journal of
Marketing Theory and Practice, 19 (2), 139–152.
Hayes, A. F. and Matthes, J. (2009). Computational procedures for probing interactions in OLS and
logistic regression: SPSS and SAS implementations. Behavior research methods, 41 (3), 924–936.
Henseler, J. and Chin, W. W. (2010). A Comparison of Approaches for the Analysis of Interaction
Effects Between Latent Variables Using Partial Least Squares Path Modeling. Structural Equation
Modeling: A Multidisciplinary Journal, 17 (1), 82–109.
Henseler, J. and Fassott, G. (2010). Testing Moderating Effects in PLS Path Models: An Illustration of
Available Procedures. In: Springer Handbooks of Computational Statistics. Handbook of Partial
Least Squares (pp. 713–735). Ed. by V. Esposito Vinzi, W. W. Chin, J. Henseler, & H. Wang.
Springer Berlin Heidelberg.
Hornby, A. S. and Cowie, A. P. (1989). Oxford Advanced Learner's Dictionary of Current English. 4th
ed. Oxford: Oxford University Press.
Huber, F. (2007). Kausalmodellierung mit partial least squares: Eine anwendungsorientierte
Einführung. 1. Aufl. Lehrbuch. Wiesbaden: Gabler.
Hwang, M. I. and Xu, H. (2005). A Survey of Data Warehousing Success Issues. Business Intelligence
Journal, 10 (4), 7–14.
IBM. (2013). SPSS Statistics: Version 22. Url: http://www.ibm.com/software/analytics/spss/ (visited
11/25/2013).
Inmon, W. H. (1996). Building the data warehouse. 2nd ed. Wiley computer publishing. New York,
NY: Wiley.
Joshi, K. and Curtis, M. (1999). Issues in Building a Successful Data Warehouse. Information
Strategy, 15 (2), 28–35.
Kimball, R. and Ross, M. (2002). The Data Warehouse Toolkit: The Complete Guide to Dimensional
Modeling. 2nd ed. New York, NY: Wiley.
Kline, R. B. (1998). Principles and practice of structural equation modeling. Methodology in the social
sciences. New York [u.a.]: Guilford Press.
Knabke, T. and Olbrich, S. (2011). Towards agile BI: applying in-memory technology to data
warehouse architectures. In: Innovative Unternehmensanwendungen mit In-Memory-Data-
Management. Beiträge der Tagung IMDM 2011, 2.12.2011 in Mainz, pp. 101–114.
Knabke, T. and Olbrich, S. (2013). Understanding Information System Agility – The Example of
Business Intelligence. In: Proceedings of the 46th Hawaii International Conference on System
Sciences (HICSS'13), pp. 3817–3826.
Knabke, T., Olbrich, S. and Fahim, S. (2014). Impacts of In-memory Technology on Data Warehouse
Architectures - A Prototype Implementation in the Field of Aircraft Maintenance and Service. In:
Advancing the Impact of Design Science: Moving from Theory to Practice - 9th International
Conference, DESRIST 2014, Miami, FL, USA, May 22-24, 2014. Proceedings, pp. 383–387.
Lee, A. (1989). A Scientific Methodology for MIS Case Studies. MIS Quarterly, 13 (1), 32–52.
Leeuw, E. D. de, Hox, J. J. and Dillman, D. A. (2008). International handbook of survey methodology.
EAM book series. New York [u.a.]: LEA, Lawrence Erlbaum Ass.
MacKenzie, I., Meyer, C. and Noble, S. (2013). How retailers can keep up with consumers. McKinsey
& Company. Url:
http://www.mckinsey.com/insights/consumer_and_retail/how_retailers_can_keep_up_with_consu
mers (visited 03/15/2015).
Marjanovic, O. (2007). The Next Stage of Operational Business Intelligence: Creating New
Challenges for Business Process Management. In: Proceedings of the 40th Annual Hawaii
International Conference on System Sciences (HICSS'07).
McCoy, D. W. and Plummer, D. C. (2006). Defining, Cultivating and Measuring Enterprise Agility.
Gartner Research. Url: http://www.gartner.com/id=491436 (visited 04/09/2012).
Moss, L. (2009). Beware of Scrum Fanatics On DW/BI Projects. EIMInsight Magazine, 3 (3).
Mulpuru, S., Johnson, C., Freeman Evans, P. and Naparstek, L. (2014). The New Paradigm Of Retail.
Forrester Research. Url: https://www.forrester.com/The+New+Paradigm+Of+Retail/fulltext/-/E-
RES61138 (visited 03/16/2015).
Olszak, C. M. (2014). Towards an Understanding Business Intelligence. A Dynamic Capability-Based
Framework for Business Intelligence. In: Proceedings of the 2014 Federated Conference on
Computer Science and Information Systems (FedCSIS), pp. 1103–1110.
Pankaj, P., Hyde, M., Ramaprasad, A. and Tadisina, S. K. (2009). Revisiting Agility to Conceptualize
Information Systems Agility. In: Premier Reference Source. Emerging Topics and Technologies in
Information Sytems (pp. 19–54). Ed. by M. D. Lytras & P. Ordóñez de Pablos. Hershey, Pa. [et
al.]: Information Science Reference.
Plattner, H. (2009). A Common Database Approach for OLTP and OLAP Using an In-Memory
Column Database. In: Proceedings of the 35th SIGMOD International Conference on Management
of Data, Providence, Rhode Island.
Plattner, H. and Zeier, A. (2011). In-Memory Data Management: An Inflection Point for Enterprise
Application. Berlin, Heidelberg: Springer.
Redman, T. C. (2008). Data Driven: Profiting from Your Most Important Business Asset. Boston,
Mass: Harvard Business Press.
Rifaie, M., Kianmehr, K., Alhajj, R. and Ridley, M. J. (2008). Data warehouse architecture and design.
In: Proceedings of the IEEE International Conference on Information Reuse and Integration, IRI
2008, 13-15 July 2008, Las Vegas, Nevada, USA, pp. 58–63.
Rönkkö, M., Parkkila, K. and Ylitalo, J. (2012). Use of Partial Least Squares as a Theory Testing
Tool - an Analysis of Information Systems Papers. In: ECIS 2012 Proceedings.
Schulte, W. R., Chandler, N., Herschel, G., Laney, D., Sallam, R. L., Tapadinhas, J. and Sommer, D.
(2013). Predicts 2014: Business Intelligence and Analytics Will Remain CIO's Top Technology
Priority. Url: https://www.gartner.com/doc/2629220/predicts--business-intelligence-analytics
(visited 02/12/2015).
Schwaber, K. (1997). SCRUM Development Process. In: Business Object Design and Implementation.
OOPSLA '95 workshop proceedings, 16 October 1995, Austin, Texas (pp. 117–134). Ed. by J.
Sutherland, P. Patel, C. Casanave, G. Hollowell, & J. Miller. London: Springer.
Sharifi, H. and Zhang, Z. (1999). A methodology for achieving agility in manufacturing organisations:
An introduction. International Journal of Production Economics, 62 (1/2), 7–22.
Shin, B. (2003). An Exploratory Investigation of System Success Factors in Data Warehousing.
Journal of the Association for Information Systems, 4 (1), 141–170.
SmartPLS. (2014). SmartPLS: Version 3.1.6. Url: http://www.smartpls.com/ (visited 11/14/2014).
Towill, D. and Christopher, M. (2002). The Supply Chain Strategy Conundrum: To be Lean Or Agile
or To be Lean And Agile? International Journal of Logistics Research and Applications, 5 (3),
299–309.
van Oosterhout, M., Waarts, E., van Heck, E. and van Hillegersberg, J. (2007). Business agility: Need,
readiness and alignment with IT strategies. In: Agile Information Systems. Conceptualization,
Construction, and Management (pp. 52–69). Ed. by K. C. Desouza. Amsterdam, Boston:
Butterworth-Heinemann.
Watson, H. J. (2009). Tutorial: Business Intelligence – Past, Present, and Future. Communication of
the AIS, 25 (Article 39), 487–511.
Watson, H. J. and Wixom, B. H. (2007). The Current State of Business Intelligence. IEEE Computer,
40 (9), 96–99.
Weiber, R. and Mühlhaus, D. (2010). Strukturgleichungsmodellierung: Eine anwendungsorientierte
Einführung in die Kausalanalyse mit Hilfe von AMOS, SmartPLS und SPSS. Springer-Lehrbuch.
Heidelberg [u.a.]: Springer.
White, C. (2005). The Next Generation of Business Intelligence: Operational BI. DM Review (May
2005).
Wixom, B. and Watson, H. (2010). The BI-Based Organization. International Journal of Business
Intelligence Research, 1 (1), 13–28.
Yin, R. K. (1994). CASE STUDY RESEARCH: Design and methods. 2nd ed. Applied Social
Research Methods Series: v. 5. Thousand Oaks: Sage Publications.
Zimmer, M., Baars, H. and Kemper, H. G. (2012). The Impact of Agility Requirements on Business
Intelligence Architectures. In: 2012 45th Hawaii International Conference on System Science
(HICSS), pp. 4189–4198.
Although Business Intelligence (BI) is one of the most essential technologies to be purchased, the implementation of many BI applications fails. The reasons for this failure are not clear and still not well investigated. Resource-based View (RBV) and dynamic capability theory could help to overcome this gap and to provide an appropriate theoretical basis for future research in BI area. It is considered that BI capabilities may be critical functionalities that help organizations to improve their performance and adopt to environmental change. The research objectives for this study are: (1) conceptualization and discussion on BI dynamic capability, (2) building the comprehensive framework of BI capabilities. In order to address these objectives, the remainder of the paper is structured as follows: The first sections provide the theoretical foundations of BI, RBV and dynamic capability theory. Next, the BI capability was conceptualized. Finally, a model of BI as a dynamic capability, was proposed. The study was based mainly on: (1) a critical analysis of literature, (2) an observation of different BI initiatives undertaken in various organizations, as well as on (3) interviews with managers and experts in BI. The results of this study can be used by IT and business leaders as they plan and develop BI capabilities in their organizations.