The design structure matrix (DSM), also called the dependency structure matrix, has become a widely used modeling framework across many areas of research and practice. The DSM brings advantages of simplicity and conciseness in representation, and, supported by appropriate analysis, can also highlight important patterns in system architectures (design structures), such as modules and cycles. A literature review in 2001 cited about 100 DSM papers; there have been over 1000 since. Thus, it is useful to survey the latest DSM extensions and innovations to help consolidate progress and identify promising opportunities for further research. This paper surveys the DSM literature, primarily from archival journals, and organizes the developments pertaining to building, displaying, analyzing, and applying product, process, and organization DSMs. It then addresses DSM applications in other domains, as well as recent developments with domain mapping matrices (DMMs) and multidomain matrices (MDMs). Overall, DSM methods are becoming more mainstream, especially in the areas of engineering design, engineering management, management/organization science, and systems engineering. Despite significant research contributions, however, DSM awareness seems to be spreading more slowly in the realm of project management.
Common method variance (CMV) is the amount of spurious correlation between variables that is created by using the same method, often a survey, to measure each variable. CMV may lead to erroneous conclusions about relationships between variables by inflating or deflating findings. We analyzed recent survey research in IEEE Transactions on Engineering Management, Journal of Operations Management, and Production and Operations Management to assess if and how scholars address CMV. We found that two-thirds of the relevant articles published between 2001 and 2009 did not formally address CMV, and many that did relied on relatively weak remedies. These findings have troubling implications for efforts to build knowledge within information technology, operations, and supply chain management research. In an effort to strengthen future research designs, we provide recommendations to help scholars better address CMV. Given the potentially severe effects of CMV, authors should apply the recommended CMV remedies within their survey-based studies, and reviewers should hold authors accountable when they fail to do so.
In addition to acquiring external technology, firms have started to actively commercialize technological knowledge, which represents the opposite type of technology transaction. The strong interactions with a firm's environment contrast with the traditional closed approaches to innovation. Therefore, this new paradigm has been termed open innovation. Prior research into this field has mostly been limited to theoretical considerations and case studies. More general work has usually focused either on external technology acquisition or on external technology exploitation. Accordingly, I take an integrated perspective and regard the two types of technology transactions as the main dimensions of a firm's strategic approach to open innovation. Drawing on these dimensions, I use data from a questionnaire-based study of 154 medium-sized and large companies to identify groups of firms that pursue homogeneous strategies regarding open innovation. Accordingly, this analysis is the first large-scale study that describes the current state of open innovation in practice. Moreover, it is the first work that identifies firms' strategic approaches to technology transactions along the innovation process.
Energy is an expensive resource that is becoming increasingly scarce as population and demand grow. In this paper, a mathematical model to minimize energy consumption and reduce the total completion time of a single machine is proposed, and a multiobjective genetic algorithm is utilized to obtain an approximate set of nondominated alternatives. Furthermore, dominance rules and a heuristic are proposed to increase the speed of the proposed genetic algorithm. Finally, the analytic hierarchy process is utilized to select a solution according to some additional criteria.
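The core of the multiobjective step above is extracting the nondominated (Pareto) set from a pool of candidate schedules. As a minimal sketch (not the authors' implementation), assuming each solution is scored as a pair of minimized objectives such as (energy consumption, total completion time):

```python
def nondominated(solutions):
    """Return the Pareto-nondominated subset of a list of objective tuples.

    All objectives are assumed to be minimized. A solution is dominated if
    some other solution is no worse in every objective and differs in at
    least one (illustrative sketch, not the paper's algorithm).
    """
    front = []
    for s in solutions:
        dominated = any(
            all(o <= v for o, v in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front
```

A genetic algorithm would apply such a filter to each generation to maintain the approximate Pareto front the abstract describes.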
Green and lean paradigms have been adopted by companies in order to manage their relationships with suppliers in a supply chain management context, but nearly always separately and with little understanding of their influence on company performance. This paper proposes a theoretical framework for analyzing the influence of green and lean upstream supply chain management practices on the sustainable development of businesses. To meet this objective, a set of performance measures is proposed covering economic (operational cost, environmental cost, and inventory cost), environmental (business wastage, green image, and CO2 emissions), and social (corruption risk, supplier screening, and local suppliers) perspectives. An explanatory case study was conducted at a Portuguese automaker to qualitatively test the validity of the proposed theoretical framework. From the case study, a model is suggested that encompasses the relationships between green and lean upstream supply chain practices and sustainable business development.
In this paper, the existing models of supply chain resilience assessment are extended by incorporating the ripple effect and structure reconfiguration. Ripple effect mitigation control is vital for supply chain risk management from the standpoints of structural resilience and recoverability. The research is based on a hybrid fuzzy-probabilistic approach. The genome method is applied with the objective of including the structural properties of supply chain design in the resilience assessment. A supply chain design resilience index is developed, and its computation and application are demonstrated. The results suggest a method of comparing different supply chain designs with regard to resilience, considering both disruption propagation and recovery. The method also allows the identification of groups of critical suppliers whose failure interrupts supply chain operation.
Managing and improving organizational capabilities is a significant and complex issue for many companies. To support management and enable improvement, performance assessments are commonly used. One way of assessing organizational capabilities is by means of maturity grids. While maturity grids may share a common structure, their content differs and very often they are developed anew. This paper presents both a reference point and guidance for developing maturity grids. This is achieved by reviewing 24 existing maturity grids and by suggesting a roadmap for their development. The review places particular emphasis on embedded assumptions about organizational change in the formulation of the maturity ratings. The suggested roadmap encompasses four phases: planning, development, evaluation, and maintenance. Each phase discusses a number of decision points for development, such as the selection of process areas, maturity levels, and the delivery mechanism. An example demonstrating the roadmap's utility in industrial practice is provided. The roadmap can also be used to evaluate existing approaches. In concluding the paper, implications for management practice and research are presented.
In recent years, there has been a proliferation of interest in resilience in the supply chain field. Even though the literature has acknowledged the antecedents of resilient supply chains, such as supply chain visibility, cooperation, and information sharing, their confluence in creating resilient supply chains in settings where behavioral issues such as trust and behavioral uncertainty prevail has not been studied. To address this gap, we conceptualized a theoretical framework firmly grounded in the resource-based view (RBV) and the relational view, which is tested on 250 manufacturing firms using hierarchical moderated regression analysis. The study offers a nuanced understanding of supply chain resilience and the implications of supply chain visibility, cooperation, trust, and behavioral uncertainty. Implications and suggestions for further research are provided.
Environmental concerns, from consumers, governments, and academics, have encouraged businesses to incorporate more environmentally conscious designs in their new product development. However, selecting the best green design is a decision-making process that is not easy to address. Life cycle assessment (LCA) is a popular and comprehensive tool for accomplishing this objective. Nevertheless, LCA is a time-consuming process that requires substantial resources and expertise. This research proposes an innovative approach to performing structured LCA in conjunction with the fuzzy analytic hierarchy process. In doing so, some of the disadvantages of LCA can be remedied, providing a practical tool for performing LCA.
Prior research argues that alignment between business and information systems (IS) strategies enhances organizational performance. However, factors affecting alignment have received relatively little empirical attention. Moreover, IS strategic alignment is assumed to facilitate the performance of all organizations, regardless of type or business strategy. By using two studies of business firms and academic institutions, this paper: 1) develops and tests a model relating alignment, its antecedents, and its consequences and 2) examines differences in these relationships across organizational types and strategies. Findings indicate that alignment depends on shared domain knowledge and prior IS success, and also support the expected positive impact of alignment on organizational performance. Differences across Prospector, Analyzer, and Defender business strategies are examined. A key research contribution is the empirical demonstration that the importance of alignment, as well as the mechanisms used to attain alignment, vary by business strategy and industry. In past alignment studies, controlling for industry has been common. The findings suggest that future research studies should also control for business strategy. The article also empirically demonstrates that past implementation success influences alignment. In addition, it highlights the influence of a process variable, strategic planning, on the development of shared knowledge and, consequently, on alignment. This paper examines strategic issues related to the management of technology. Data from multiple surveys are used to test the extent to which strategic planning, shared business-IS knowledge, prior IS success, and other variables consistently enhance IS alignment. The study also provides empirical support for the popular argument that IS alignment improves organizational performance.
It extends the current literature by examining the extent to which these findings hold across firm strategies and industries. The authors argue that not all firms are equally well served by allocating scarce resources to improve IS alignment.
The burgeoning environmental regulations are forcing companies to green their supply chains by integrating all of their business value-adding operations so as to minimize the impact on the environment. One dimension of greening the supply chain is extending the forward supply chain to collection and recovery of products in a closed-loop configuration. Remanufacturing is the basis of profit-oriented reverse logistics in which recovered products are restored to a marketable condition in order to be resold to the primary or secondary market. In this paper, we introduce a multiechelon multicommodity facility location problem with a trading price of carbon emissions and a cost of procurement. The company might either incur costs if the carbon cap, normally assigned by regulatory agencies, is lower than the total emissions, or gain profit if the carbon cap is higher than the total emissions. A numerical study is presented which studies the impact of different carbon prices on cost and configuration of supply chains.
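The cap-and-trade mechanism described above has a simple structure: the firm pays at the carbon trading price when total emissions exceed the cap and earns revenue when they fall below it. A minimal sketch of that cost term, with function and parameter names chosen here for illustration:

```python
def carbon_trading_term(total_emissions, carbon_cap, carbon_price):
    """Linear cap-and-trade cost term as described in the abstract.

    Returns a positive value (a cost, buying allowances) when emissions
    exceed the cap, and a negative value (revenue from selling surplus
    allowances) when emissions fall below it. Illustrative sketch only;
    the paper embeds this term in a multiechelon facility location model.
    """
    return carbon_price * (total_emissions - carbon_cap)
```

In the full model this term would be added to procurement, facility, and logistics costs in the objective function, so a higher carbon price shifts the optimal network configuration toward lower-emission designs.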
While business analytics is being increasingly used to gain data-driven insights to support decision making, little research exists regarding the mechanism through which business analytics can be used to improve decision-making effectiveness (DME) at the organizational level. Drawing on the information processing view and contingency theory, this paper develops a research model linking business analytics to organizational DME. The research model is tested using structural equation modeling based on 740 responses collected from U.K. businesses. The key findings demonstrate that business analytics, through the mediation of a data-driven environment, positively influences information processing capability, which in turn has a positive effect on DME. The findings also demonstrate that the paths from business analytics to DME have no statistical differences between large and medium companies, but some differences between manufacturing and professional service industries. Our findings contribute to the business analytics literature by providing useful insights into business analytics applications and the facilitation of data-driven decision making. They also contribute to managers' knowledge and understanding by demonstrating how business analytics should be implemented to improve DME.
In this paper, we consider production planning when inputs have different and uncertain quality levels and there are capacity constraints. This situation is typical of most remanufacturing environments, where inputs are product returns (also called cores). Production (remanufacturing) cost increases as the quality level decreases, and any unused cores may be salvaged at a value that increases with their quality level. Decision variables include, for each period and under a certain probabilistic scenario, the amount of cores to grade, the amount to remanufacture for each quality level, and the amount of inventory to carry over for future periods for ungraded cores, graded cores, and finished remanufactured products. Our model is grounded in data collected at a major original equipment manufacturer that also remanufactures. We formulate the problem as a stochastic program; although it is a large linear program, it can be solved easily using CPLEX. We provide a numerical study to generate insights into the nature of the solution.
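The cost trade-off in the model above can be illustrated with a drastically simplified, single-period, deterministic version (one demand figure, no grading or inventory decisions, names invented here): remanufacturing a core of a given grade incurs its production cost and forgoes its salvage value, so filling demand in ascending order of cost plus salvage is optimal for this linear relaxation.

```python
def allocate_cores(grades, demand):
    """Single-period, deterministic toy version of the core-allocation
    trade-off (not the paper's stochastic program).

    grades: dict name -> (remanufacture_cost, salvage_value, available).
    Effective unit cost of using a core is cost + salvage (the salvage
    value is forgone), so a greedy cheapest-first fill is optimal here.
    """
    plan = {}
    remaining = demand
    for name, (cost, salvage, avail) in sorted(
            grades.items(), key=lambda kv: kv[1][0] + kv[1][1]):
        take = min(avail, remaining)
        if take > 0:
            plan[name] = take
        remaining -= take
    if remaining > 0:
        raise ValueError("insufficient cores to meet demand")
    return plan
```

The full model replaces this greedy logic with a multiperiod stochastic linear program over quality scenarios, but the grade-dependent cost/salvage tension is the same.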
This research addresses the sustainability- and safety-related challenges of a complex, practical, real-time maritime transportation problem and proposes a multiobjective mathematical model integrating different shipping operations. A mixed-integer nonlinear programming (MINLP) model is formulated that considers maritime operations such as the routing and scheduling of ships, time windows under a port's high-tide scenario, a discrete planning horizon, loading/unloading operations, carbon emissions from vessels, and ship draft restrictions for maintaining vessel safety at the port. The relationship between fuel consumption and vessel speed is included in the model to estimate the total fuel consumed and the carbon emitted by each vessel. The time-window concept aims to improve the service level of the port by imposing penalty charges for a vessel arriving before the start of its time window and for a vessel failing to finish its operations within the allotted window. Another practical aspect of maritime transportation, the high-tide scenario, is included in the model to capture vessel arrival and departure times at a port. Two algorithms, nondominated sorting genetic algorithm II (NSGA-II) and multiobjective particle swarm optimization (MOPSO), are applied to solve the multiobjective mathematical model. Illustrative examples inspired by the real-life problems of an international shipping company are considered for application. The experimental results and the comparative and sensitivity analyses demonstrate the robustness of the proposed model.
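The time-window penalty described above charges separately for arriving before the window opens and for finishing after it closes. A minimal sketch of such a penalty term, with parameter names and a linear penalty shape assumed here for illustration:

```python
def time_window_penalty(arrival, finish, tw_start, tw_end,
                        early_rate, late_rate):
    """Linear soft time-window penalty (illustrative, not the paper's exact form).

    Charges early_rate per unit time the vessel arrives before the window
    opens, and late_rate per unit time its operations run past the window's
    end; a vessel served entirely inside the window pays nothing.
    """
    early = max(0.0, tw_start - arrival)
    late = max(0.0, finish - tw_end)
    return early_rate * early + late_rate * late
```

In the full MINLP, terms like this would be summed over all port calls and traded off against fuel cost and emissions by the two multiobjective algorithms.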
Risk management in supply chains is receiving increasing attention in both academia and industry. Firms are recognizing the importance of considering supply risk in evaluating and selecting suppliers for strategic partnerships. One of the critical issues faced by purchasing managers is in effectively defining, operationalizing, and incorporating supply risk measures in the supplier evaluation process. Due to the multidimensional nature of supply risk, analytical tools that can effectively integrate various risk measures into the decision process can prove useful for managers. To this end, the contribution of this paper is twofold. First, we consider extant research in supply risk in developing a framework for risk assessment based on various categories and types of risks. Second, we propose a combination of analytic hierarchy process and goal programming as a decision tool for supplier selection in the presence of risk measures and product life cycle considerations. The efficacy of the model is tested at a mid-sized automotive supplier and managerial implications are discussed.
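The analytic hierarchy process step above derives priority weights for suppliers (or risk criteria) from pairwise comparison judgments. A common approximation, sketched here as an illustration rather than the paper's exact procedure, is the row geometric-mean method applied to the pairwise comparison matrix:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric-mean method.

    pairwise: a reciprocal n-by-n comparison matrix, where pairwise[i][j]
    states how strongly criterion i is preferred over criterion j (Saaty's
    1-9 scale in practice). Sketch only; consistency checking is omitted.
    """
    n = len(pairwise)
    geo_means = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo_means)
    return [g / total for g in geo_means]
```

The resulting weights would then feed the goal-programming stage, which allocates order quantities subject to the risk and life-cycle goals the abstract mentions.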
The hybrid flow shop scheduling problem (HFSP) has been extensively discussed, with objectives mainly related to completion time. In the era of green manufacturing, the reduction of energy consumption should be fully considered in HFSP. In this study, a biobjective energy-efficient HFSP is considered, comprising three subproblems: scheduling, machine assignment, and speed selection. A three-string coding method is used to represent solutions to the three subproblems. A new teachers' teaching-learning-based optimization (TTLBO) is proposed to minimize total energy consumption and total tardiness. Total tardiness is regarded as the key objective, and a lexicographic method is adopted to compare solutions. TTLBO generates new solutions using a new optimization mechanism composed of the self-learning, interactive learning, and teaching of teachers; the learning phase of students is removed from the algorithm. Multiple neighborhood searches are used to implement the self-learning of teachers, and a global search based on crossover is chosen to imitate the other activities of teachers. A number of experiments are conducted to test the impact of the new optimization mechanism on the performance of TTLBO and to compare TTLBO with other algorithms from the literature. The computational results show that TTLBO is a competitive algorithm for the considered HFSP.
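The lexicographic comparison mentioned above ranks total tardiness strictly ahead of total energy consumption: energy only breaks ties. A minimal sketch of such a comparator (illustrative names and tolerance, not the paper's code):

```python
def lex_better(a, b, tol=1e-9):
    """True if solution a is lexicographically better than solution b.

    a and b are (total_tardiness, total_energy) pairs, both minimized;
    tardiness is the primary objective, and energy is compared only when
    tardiness values are equal within the tolerance.
    """
    if a[0] < b[0] - tol:
        return True
    if abs(a[0] - b[0]) <= tol:
        return a[1] < b[1] - tol
    return False
```

An algorithm like TTLBO would use such a comparator whenever it must decide whether a newly generated schedule replaces an incumbent.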
Infrastructure projects regularly experience cost and schedule overruns. Research led by Flyvbjerg has suggested that misrepresentation and optimism bias are primary causes of overruns. While Flyvbjerg's research has made a significant contribution to improving understanding of why economic infrastructure projects experience overruns, it does not adequately explain why this is the case for social infrastructure. To address this shortcoming, case studies are used to determine the intermediary events and actions that contributed to project cost overruns. The pathogens, events, and actions that contributed to overruns are identified and analyzed. The analysis of the case findings led to the proposal of a nomological framework for social infrastructure project overruns. Acknowledging the systemic pathogenic influences has enabled the establishment of an orthodoxy, which provides an impetus for addressing the issues needed to improve the performance of social infrastructure projects.
Systems engineering of products, processes, and organizations requires tools and techniques for system decomposition and integration. A design structure matrix (DSM) provides a simple, compact, and visual representation of a complex system that supports innovative solutions to decomposition and integration problems. The advantages of DSMs vis-a-vis alternative system representation and analysis techniques have led to their increasing use in a variety of contexts, including product development, project planning, project management, systems engineering, and organization design. This paper reviews two types of DSMs, static and time-based, and four DSM applications: (1) component-based or architecture DSM, useful for modeling system component relationships and facilitating appropriate architectural decomposition strategies; (2) team-based or organization DSM, beneficial for designing integrated organization structures that account for team interactions; (3) activity-based or schedule DSM, advantageous for modeling the information flow among process activities; and (4) parameter-based (or low-level schedule) DSM, effective for integrating low-level design processes based on physical design parameter relationships. A discussion of each application is accompanied by an industrial example. The review leads to conclusions regarding the benefits of DSMs in practice and barriers to their use. The paper also discusses research directions and new DSM applications, both of which may be approached with a perspective on the four types of DSMs and their relationships.
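A central DSM analysis in the activity-based applications above is finding coupled blocks: groups of activities that feed information back to one another and therefore require iteration. As a minimal sketch (one common convention assumed here: a binary matrix where dsm[i][j] = 1 means element i depends on element j), mutual reachability in the transitive closure identifies such blocks:

```python
def transitive_closure(dsm):
    """Warshall's algorithm on a binary DSM: reach[i][j] == 1 iff element i
    depends on element j directly or through a chain of dependencies."""
    n = len(dsm)
    reach = [row[:] for row in dsm]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

def coupled_blocks(dsm):
    """Group elements that are mutually reachable, i.e., lie on a common
    feedback cycle; singletons indicate sequential or independent elements."""
    reach = transitive_closure(dsm)
    n = len(dsm)
    seen, blocks = set(), []
    for i in range(n):
        if i in seen:
            continue
        block = [j for j in range(n)
                 if j == i or (reach[i][j] and reach[j][i])]
        seen.update(block)
        blocks.append(block)
    return blocks
```

DSM sequencing (partitioning) algorithms reorder the matrix so that such blocks appear as square clusters on the diagonal, making the iteration loops visible.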
An e-vendor's website inseparably embodies an interaction with the vendor and an interaction with the IT website interface. Accordingly, research has shown two sets of unrelated usage antecedents by customers: (1) customer trust in the e-vendor and (2) customer assessments of the IT itself, specifically the perceived usefulness and perceived ease of use of the website as depicted in the technology acceptance model (TAM). Research suggests, however, that the degree and impact of trust, perceived usefulness, and perceived ease of use change with experience. Using existing, validated scales, this study describes a free-simulation experiment that compares the degree and relative importance of customer trust in an e-vendor vis-a-vis TAM constructs of the website, between potential (i.e., new) customers and repeat (i.e., experienced) ones. The study found that repeat customers trusted the e-vendor more, perceived the website to be more useful and easier to use, and were more inclined to purchase from it. The data also show that while repeat customers' purchase intentions were influenced by both their trust in the e-vendor and their perception that the website was useful, potential customers were not influenced by perceived usefulness, but only by their trust in the e-vendor. Implications of this apparent trust barrier and guidelines for practice are discussed.
Although considerable research has investigated the influence of institutional pressures on environmental practices, few studies have examined their impact on innovation among third-party logistics (3PL) providers. Moreover, extant scholarship equivocates on the relationships between various institutional pressures and green innovation. Different industries develop unique responses to their contexts, and 3PL providers may counter environmental issues unlike other sectors (e.g., manufacturing). Integrating institutional theory and the natural-resource-based view, this paper examines green innovations of 3PL providers as a response to their institutional pressures and a means of gaining competitive advantage, and explores the contingent effect of market uncertainty. Based on survey data acquired from 165 3PL providers in China, the empirical results suggest that customer pressure and competitive pressure significantly impel 3PL providers to adopt green innovations, while to this point, regulatory pressure has not affected such innovation. Additionally, green innovation positively affects financial performance for 3PL providers in China. Moreover, market uncertainty alleviates the driving effect of customer pressure on green innovation, but amplifies green innovation's contribution to performance. Research implications for green innovation in the 3PL context, practical implications, and future research directions are discussed.