While several studies have investigated learners' opinions toward using Facebook in learning, limited attention has been paid to examining the effectiveness of Facebook as a learning tool in classrooms. Thus, this article proposes a newly designed Facebook learning tool used in a public Tunisian university to teach a “game development” course. It then investigates the tool's impact on learners' level of knowledge and motivation compared to the traditional learning method, as well as its impact on learners' technology acceptance and attitudes. The experimental results showed that the Facebook learning tool can significantly improve learners' level of knowledge. In addition, learners who learned with this tool reported a high degree of perceived usefulness, security, and intention to use the tool again, along with a favorable attitude towards it. Finally, a set of recommendations is provided that researchers and educators should consider when using Facebook in their classrooms.
This article reports on a horizontal merger of two insurance companies and their failure to properly integrate their information systems. The task, although supported by both well-founded market research and external consultants, proved more challenging than anticipated due to the complexity and interconnectedness of the related business processes. The main difficulties arose in the areas of skill development, skill retention, and management buy-in. This article thereby adds valuable insights to the stream of case studies on merger and acquisition activities by providing a deeper look into IS integration, which most contributions treat as a black box.
This qualitative case study features SteadyServ, a beer inventory monitoring solution vendor, and its Internet of Things (IoT)-based, radio frequency identification (RFID)-enabled technology solution. IoT-based information technology systems are powering new business models and innovative cloud-supported solutions addressing use cases in many industries. At this early stage, IoT is taking off in the monitoring of premises, products, supply chains, and customers. Content analysis of primary and secondary data was used to interpret the deployment of the firm's RFID-enabled IoT solution at its customer sites. Two theoretical frameworks were used to understand the IT solution experiences of customer firms: socio-technical systems theory and affordances theory.
Cloud computing has evolved as a new paradigm for infrastructure management and has gained ample consideration in both industrial and academic research. A hidden Markov model (HMM) combined with Markov games can provide a countermeasure for many cyber security threats and malicious intrusions in a network or in a cloud. An HMM can be trained using sequences obtained by analyzing the file traces of a packet analyzer such as the Wireshark network analyzer. In this article, the authors propose a model in which an HMM is built from a set of training examples obtained with such a network analyzer. As the model is not an intrusion detection system itself, the obtained file traces serve as training examples for testing the HMM. The model also predicts a probability value for each tested sequence and states whether the sequence is anomalous or not. A numerical example is also given that computes, for a given HMM, the probability of an observation sequence and the most likely hidden-state sequence.
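The two HMM computations the abstract mentions, scoring an observation sequence and recovering the most likely hidden-state sequence, can be sketched as below. The two-state model ("normal"/"attack"), the observation alphabet, and every probability are illustrative assumptions for demonstration, not the authors' trained values.

```python
# Toy HMM: hidden states "normal"/"attack" emitting coarse packet types.
# All probabilities are made up for illustration.
states = ["normal", "attack"]
obs_symbols = ["tcp", "udp", "icmp"]
start_p = {"normal": 0.8, "attack": 0.2}
trans_p = {"normal": {"normal": 0.9, "attack": 0.1},
           "attack": {"normal": 0.3, "attack": 0.7}}
emit_p = {"normal": {"tcp": 0.6, "udp": 0.3, "icmp": 0.1},
          "attack": {"tcp": 0.1, "udp": 0.2, "icmp": 0.7}}

def forward(obs):
    """Total probability of an observation sequence (forward algorithm)."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans_p[p][s] for p in states) * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

def viterbi(obs):
    """Most likely hidden-state sequence and its probability (Viterbi)."""
    V = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        V = {s: max(((V[p][0] * trans_p[p][s] * emit_p[s][o], V[p][1] + [s])
                     for p in states), key=lambda t: t[0])
             for s in states}
    return max(V.values(), key=lambda t: t[0])

seq = ["icmp", "icmp", "tcp"]
seq_prob = forward(seq)                 # probability the model assigns to seq
best_prob, best_path = viterbi(seq)     # most likely state path for seq
```

A sequence whose `forward` probability falls below a chosen cutoff would be flagged as anomalous under the scheme the abstract describes; the cutoff itself is a design choice not specified here.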
Prediction of software development effort is a key task for the effective management of any software company, and the accuracy and reliability of the prediction mechanisms used to estimate that effort are equally important. A series of experiments was conducted to progress gradually towards more accurate estimation of software development effort. While conducting these experiments, however, it was found that the size of the training set was not sufficient to train a large and complex artificial neural network (ANN). To overcome the limited size of the available training data set, a novel multi-layered architecture based on a neural network model is proposed. The accuracy of the proposed multi-layered model is assessed using different criteria, which demonstrate its superiority.
Cancer is a disease in which cells in the body grow and divide uncontrollably. Breast cancer is the second most common cancer in women after lung cancer. Remarkable advances in health sciences and biotechnology have produced a huge amount of gene expression and clinical data, and machine learning techniques are improving the early detection of breast cancer from such data. The research work carried out here focuses on the application of machine learning methods, data analytic techniques, tools, and frameworks in breast cancer research, with respect to cancer survivability, recurrence, prediction, and detection. Among the machine learning techniques most widely used for breast cancer detection are the support vector machine and the artificial neural network. The Apache Spark data processing engine is found to be compatible with most of the machine learning frameworks.
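As a minimal sketch of the support-vector-machine approach mentioned above, the following trains an RBF-kernel SVM on the public Wisconsin breast cancer dataset bundled with scikit-learn. The dataset, kernel, and hyperparameters are illustrative choices, not the specific pipeline of any study surveyed.

```python
# Illustrative SVM classifier for breast cancer detection (scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Feature scaling matters for RBF kernels, hence the pipeline.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)   # held-out classification accuracy
```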
Cloud computing is not associated with a specific technology; rather, it is an alternative method of delivering technology as a service. This article investigates current cloud computing adoption in the United States (USA) and United Kingdom (UK) hedge fund industry. Hedge fund technologists, prime service consultants, technology service providers, industry application vendors, investors, and an independent information security consultant were surveyed for this article. The article acknowledges the growth of cloud computing in the hedge fund sector. This research also highlights that the definition of the private cloud is vague and requires further classification, and it elaborates on the variants of the private cloud. This is important because these variants offer varying benefits and risks, to which the hedge fund sector has proven sensitive. Equally, this article argues that some current security concerns are overstated and perhaps reflect a conservative decision-making framework rather than a realistic consideration of the options.
Traditional cybersecurity, security, or information security awareness programs have become ineffective at changing people's behavior in recognizing, blocking, or reporting cyberthreats within their organizational environment. As a result, human errors and actions continue to demonstrate that people are the weakest link in cybersecurity. This article studies the most recent cybersecurity awareness programs and their attributes. Furthermore, the authors compiled recent awareness methodologies, frameworks, and approaches, and they introduce a proposed training model to address existing deficiencies in awareness training. The Cybersecurity Awareness TRAining Model (CATRAM) has been designed to deliver training to different organizational audiences, each with specific content and separate objectives. The authors conclude by addressing the need for future research into new approaches that keep cybersecurity awareness focused on the ever-changing cyberthreat landscape.
Inadequacies and faults that give rise to vulnerabilities are hazardous for websites. The authors' chief intention is to reduce assaults on websites by restraining the attributes responsible for these assaults. This article categorizes those attributes into 10 categories and arranges them in priority order according to their severity. These attributes contribute to losses that are both monetary and human. By prioritizing these attributes, web designers as well as users will double-check these aspects before confidential information is entered into a website. The opinions of web designers and experts from different companies were captured to prioritize these attributes, using an analytical hierarchy process and a two-way assessment method, so that losses are minimized. The research confirms that the total severity measure captures each attribute's contribution towards vulnerabilities. The findings of the two-way assessment technique show that only one attribute is extremely severe in comparison to the others; it needs imperative consideration by designers when building websites, and by users before entering confidential credentials, in order to curtail the losses caused by black-hat attackers.
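The analytical hierarchy process (AHP) step described above can be sketched as follows: priority weights are derived from a pairwise comparison matrix by column normalization (the standard approximate-eigenvector method). The three attributes and the judgment values on Saaty's 1-9 scale are hypothetical, not the study's actual data.

```python
# Minimal AHP priority-weight sketch with illustrative judgments.
def ahp_weights(matrix):
    """Approximate AHP priority vector via column normalization."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    # Normalize each column, then average across each row to get its weight.
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical pairwise judgments among three vulnerability attributes
# A, B, C: matrix[i][j] says how much more severe attribute i is than j.
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
weights = ahp_weights(pairwise)   # weights sum to 1; larger = higher priority
```

In a full AHP application one would also check the consistency ratio of the judgment matrix before trusting the weights; that step is omitted here for brevity.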
Accurate time and budget estimates are essential for planning software projects correctly. Quite often, software projects fall into unrealistic estimates, and the core reason generally lies in problems with the requirement analysis. To investigate such problems, risk has to be identified and assessed at the requirement engineering phase itself, so that defects do not seep down to later software development phases. This article proposes a multi-criteria risk assessment model that computes risk at the requirement level as a cumulative risk score, based on a weighted score assigned to each criterion. Comparison with other approaches and experimentation show that with this model it is possible to predict risk in the early phases of the software development life cycle with high accuracy.
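A weighted cumulative risk score of the kind described can be sketched as below. The criterion names ("ambiguity", "volatility", "dependency"), their weights, and the per-criterion scores are all hypothetical placeholders, not the model's actual criteria or values.

```python
# Hypothetical cumulative risk score for one requirement:
# weighted sum of per-criterion risk scores in [0, 1].
def cumulative_risk(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[c] * weights[c] for c in scores)

weights = {"ambiguity": 0.40, "volatility": 0.35, "dependency": 0.25}
scores  = {"ambiguity": 0.70, "volatility": 0.20, "dependency": 0.50}

risk = cumulative_risk(scores, weights)   # 0.7*0.40 + 0.2*0.35 + 0.5*0.25
```

A requirement whose cumulative score crosses a chosen threshold would be flagged for mitigation; the threshold is a project-level decision not fixed by the sketch.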
The advent of artificial intelligence (AI) technology in the education sector has largely taken over conventional classrooms and revolutionized the way education is conducted, to the admiration of many. Other scholars, however, believe that such early celebration of AI's benefits is unfounded and inimical to the education sector, since the adoption of modern AI teaching systems now raises long-term questions about the relevance of teachers and their classrooms in 21st-century AI education. The Marxian alienation theory was adopted for the article, and the ex post facto method and Derrida's critical method of analysis were utilized to attain its objectives. The article faults recent attempts at eulogizing the impact of AI innovations in the education sector and on human development. Extensive research is proposed as necessary for contemporary scholars of AI and education technologists before proper conclusions can be drawn about its gains for education and human development.
In today's competitive atmosphere, internet marketing serves as an effective alternative trade channel through which businesses can gain direct access to their target customers. Online buying enables shoppers to purchase goods and services directly from merchants over the internet, without human intermediaries. Consumers find products of their preference with just the click of a button, and a complete range of merchandise from various dealers, with prices, is displayed to buyers for their choice. But are competitive prices and good product features sufficient for buyers' purchasing decisions? This article attempts to answer this question and to identify the vital aspects of buyer satisfaction. A customer satisfaction index (ECCSI) is studied to comprehend the features of online satisfaction, and through a comparative analysis using the analytical hierarchy process (AHP) and intuitionistic fuzzy TOPSIS (IFT), the different e-satisfaction indices are ranked to obtain an overall view of customers' preferences.
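To make the ranking step concrete, here is a sketch of classical TOPSIS (note: the article uses the intuitionistic fuzzy variant, which handles membership/non-membership degrees; this sketch shows only the crisp core idea). The decision matrix of three hypothetical e-satisfaction indices against three benefit criteria, and the criterion weights, are illustrative.

```python
import math

# Classical TOPSIS sketch: rank alternatives by closeness to the ideal.
def topsis(matrix, weights):
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal (best) and anti-ideal (worst) points, assuming benefit criteria.
    ideal = [max(V[i][j] for i in range(n_alt)) for j in range(n_crit)]
    worst = [min(V[i][j] for i in range(n_alt)) for j in range(n_crit)]
    scores = []
    for row in V:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, worst)
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores

matrix = [[7.0, 9.0, 6.0],    # hypothetical alternatives x criteria
          [8.0, 7.0, 8.0],
          [6.0, 8.0, 9.0]]
weights = [0.5, 0.3, 0.2]
scores = topsis(matrix, weights)   # higher score = better-ranked alternative
```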
A brain hemorrhage is a type of stroke caused by a burst artery in the brain, which kills brain cells through bleeding. To reduce criticality among patients, doctors depend for treatment on accurate reports of the hemorrhage's location. Magnetic resonance imaging (MRI) is one of the best imaging modalities when functional and structural abnormalities need to be found. To aid the identification of abnormalities, a novel NB-PKC algorithm for effective recognition of brain hemorrhages in MRI is proposed. After a series of preprocessing steps, the image undergoes binary thresholding to apply an image mask to the hemorrhage region. A modified multi-level segmentation algorithm is then applied; combined features are extracted using the minimal local binary pattern and the GLCM; and finally, a novel Naïve Bayes-probabilistic kernel classification is applied. The designed techniques could accurately identify the position of a hemorrhage, classify whether an image contained an abnormality, and reduce human error.
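The binary-thresholding stage of the pipeline above can be illustrated as follows. The tiny 3x3 "image" and the threshold value are toy assumptions; a real MRI slice would be much larger and the threshold would be chosen adaptively.

```python
import numpy as np

# One stage of the described pipeline: binary thresholding to obtain a
# candidate mask over the bright (suspected hemorrhage) region.
def binary_mask(image, threshold):
    """Return a 0/1 mask selecting pixels brighter than the threshold."""
    return (image > threshold).astype(np.uint8)

image = np.array([[ 10,  50, 200],
                  [220, 180,  30],
                  [ 90, 240,  60]], dtype=np.uint8)

mask = binary_mask(image, threshold=170)
masked_region = image * mask   # keeps only the candidate pixels
```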
Multi-dimensional data is present across multimedia, data mining, and other data-driven applications. The R-tree is a popular index structure that DBMSs implement as a core component for efficient retrieval of such data. The gap between the best- and worst-case performance of an R-tree is very wide, so building quality R-trees quickly is desirable. Variations differ in how node overflows are handled during the building process. This article studies the R-tree technique used by the open-source PostgreSQL DBMS. The focus is on a specific parameter controlling node overflows as an optimisation target, and improved configurations are proposed. This parameter is hard-wired into the DBMS; therefore, an implementation is presented that makes it accessible through an SQL construct. The access method designer can then configure this parameter when trying to meet specific storage or time-related performance targets. With this study, the reader can gain insight into the effects of changing the parameter by considering spatial indexes on well-known workloads.
Today's software applications are deployed in enterprises to cater to complex business processes, integrate various business units, and address the requirements of a global customer base. The traditional methodology of software engineering succumbs to the changing needs of customers and to technology advancement. At the behest of the customer, a software system should be designed so that it stays in concert with present user needs. Agile methodology targets complex systems with its iterative, incremental, and evolutionary approach, and numerous factors contribute to its successful implementation. This led to adopting an agile approach based on 'lean' principles over the traditional software development life cycle (SDLC) approach. Collaborative work is done with the project team on a prioritized list, and implementation follows "SCRUM", an empirical framework for learning, in which each sprint yields a deliverable product. This approach has substantially reduced the 'time to market', as the customer can decide which features of the software should be delivered on a priority basis. To model trends of fault detection in each sprint, software reliability growth models (SRGMs) are used. This article presents a framework to analyze and measure the cumulative errors in an agile testing process; the authors apply various SRGMs to demonstrate their acceptability in an agile development process and finally compare these models using the Mahalanobis distance for model ranking. The Mahalanobis distance criterion is easy to compute and can be utilized to rank and select the best model in view of a set of contributing criteria.
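The Mahalanobis distance ranking step can be sketched as below: each candidate SRGM is represented by a vector of comparison criteria, and its distance from the group's distribution is computed. The four models, the three criteria, their values, and the "smallest distance ranks best" convention are illustrative assumptions, not the article's actual data or necessarily its exact ranking rule.

```python
import numpy as np

# Mahalanobis-distance sketch for ranking candidate SRGMs.
def mahalanobis(x, data):
    """Distance of point x from the distribution of the rows of data."""
    mean = data.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(data, rowvar=False))  # sample covariance
    diff = x - mean
    return float(np.sqrt(diff @ cov_inv @ diff))

# Rows: candidate models; columns: hypothetical normalized criteria
# (e.g. mean squared error, bias, variation).
criteria = np.array([[0.12, 0.30, 0.45],
                     [0.50, 0.10, 0.20],
                     [0.30, 0.60, 0.10],
                     [0.80, 0.40, 0.70]])

distances = [mahalanobis(row, criteria) for row in criteria]
best = int(np.argmin(distances))   # illustrative convention: smallest wins
```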
This research article explores the factors of shared leadership in the IT sector in India. A reliable and valid scale for the measurement of shared leadership (Scale for Measuring Shared Leadership, SMSL) is developed, as previous research brought to light the need for such a scale relating to the Indian IT sector. An attempt is made to reduce the large number of variables studied in relation to shared leadership in various books and research journals to a few workable factors, and to analyse how the derived factors explain the latent construct of shared leadership in the context of the Indian IT sector. This is a team-level study of shared leadership, with a focus on obtaining the factors through factor analysis using IBM SPSS. The variables by which researchers have tried to explain the construct of shared leadership were collected from secondary sources and used to develop a questionnaire, which was pilot tested and then evaluated for reliability and validity. The data collected were subjected to factor analysis through SPSS to obtain the factors explaining the shared leadership construct in the Indian IT sector, reducing numerous variables to a few factors through which the construct could be explained. The findings also explain the variations in the construct of shared leadership in the IT sector, and which factors contribute, and in what order, to these variations.
The recent advent of the Internet of Things (IoT) has given rise to a plethora of smart verticals, smart homes being one of them. The smart home is a classic example of IoT, wherein smart appliances connected via home gateways constitute a local home network to assist people in activities of daily life. The smart home involves IoT-based automation (such as smart lighting, heating, and surveillance) as well as remote monitoring and control of smart appliances. Besides automation, the human-in-the-loop is a unique characteristic of the smart home for offering personalized services, and understanding human behavior requires context processing. Thus, enabling the smart home involves two prominent technologies: IoT and context-aware computing. Further, since local devices in the smart home carry implicit location and situational information, fog computing can offer real-time smart home services. In this paper, the authors propose ICON (IoT-based CONtext-aware), a framework for context-aware IoT applications such as the smart home; ICON leverages fog-based IoT middleware to perform context-aware processing.
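The idea of context-aware processing at a fog node can be illustrated with a toy rule: a lighting decision derived from locally available context. This is purely a didactic sketch of the concept; the function, the context attributes, and the rule are not part of the ICON framework's actual API.

```python
# Toy context-aware rule a fog node might evaluate locally
# (illustrative only; not the ICON framework).
def decide_lighting(context):
    """Map occupancy and ambient light (lux) context to a lighting action."""
    if context["occupied"] and context["lux"] < 50:
        return "lights_on"      # someone is present and it is dark
    if not context["occupied"]:
        return "lights_off"     # empty room: save energy
    return "no_change"          # occupied and already bright enough

action = decide_lighting({"occupied": True, "lux": 30})
```

Evaluating such rules on the fog node, close to the sensors, is what gives the real-time behavior the abstract attributes to fog computing.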
Multi-channel contact centers are an increasingly important component of today's business world. They serve as a primary customer-facing channel for firms in many different industries and employ millions of operators across the globe. During their operation, they generate vast amounts of data, ranging from automatically registered logs to handwritten notes and voice recordings. Unfortunately, in most firms, the data of interest is unstructured and stored in several databases, making its exploitation very hard. This article presents a decision support system for a multi-channel, multi-service contact center for front office business process outsourcing, along with its prospective extension to a decision management system. Its core is an enterprise-wide data warehouse based on the general concept of an event. The proposed system supports a broad new set of advanced analysis tasks, ranging from operator performance assessment to call-flow simulation and data mining, providing operational and management staff with a basis for effective operational and strategic decisions.
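The "general concept of an event" at the warehouse's core can be pictured as a single uniform record type shared by every channel. The field names below are assumptions chosen for illustration, not the system's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative generic event record: one shape for every contact-center
# channel (voice, email, chat, ...), so analyses can treat them uniformly.
@dataclass
class Event:
    timestamp: datetime
    channel: str        # e.g. "voice", "email", "chat"
    service: str        # the outsourced service the contact belongs to
    operator_id: str
    outcome: str        # e.g. "resolved", "escalated", "abandoned"

e = Event(datetime(2024, 1, 5, 9, 30), "voice", "billing", "op-17", "resolved")
```

With such a unification, tasks like operator performance assessment reduce to queries over one event table rather than joins across per-channel databases.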
In spring 2016, the Distributed Autonomous Organization (The DAO) was created on Ethereum. As with Bitcoin, Ethereum uses a P2P network, where distributed ledgers are implemented as daisy-chained blocks of data. Ethereum's native cryptocurrency, Ether, is spent to execute pieces of code called smart contracts. Investors paid their Ethers for The DAO to operate and received the opportunity to vote on, and become investors in, venture projects proposed by Ethereum-based startups. Transactions and settlements between investors and startups were executed autonomously. The DAO experiment failed shortly after inception when an anonymous hacker stole over $50M USD worth of Ethers out of the $168M invested. The Ethereum community voted to return (or fork) the state of the network to one prior to the hack, returning Ethers to investors and shuttering The DAO. However, this action arguably represented a bailout—ironically, Bitcoin was conceived as a reaction against the 2008 bailout of US banks—and violated the ledger immutability and “code is law” ethos of the blockchain community.
Learning programming can be challenging, particularly object-oriented programming (OOP); however, visualization can be useful in enhancing students' learning of OOP concepts. In this study, the impact of using a 3D visual programming tool – Alice 2 – on student performance and attitude was explored in an introductory computer programming course using Java. Research participants were undergraduate computing students at Arab Open University – Jordan branch. A quasi-experimental design was adopted, in which two groups of students were chosen. The findings of this research showed that using Alice positively impacted students' performance and attitudes towards computer programming and learning OOP concepts. The study suggests incorporating Alice in the teaching of introductory programming courses.