Acta Polytechnica Hungarica, Volume 12, Issue No. 4 (2015)

Contents

  • Nan Ma ,
    Hamido Fujita ,
    Yun Zhai ,
    Shupeng Wang :

    Abstract: Fuzzy cognitive maps (FCMs) have been widely employed in the dynamic simulation and analysis of complex systems. Although a novel classifier model based on FCMs (FCMCM) was proposed in our previous work, the obvious bottleneck of the genetic learning algorithm used in the FCMCM is its poor efficiency, in particular the low speed of crossover and mutation and the delayed global convergence. Moreover, a single FCMCM lacks the necessary robustness, which limits its generalization. To this end, a quantum computation-based ensemble method, FCMCM_QC, is proposed to address the scalability problem; it employs a novel evolutionary algorithm inspired by quantum computation. The FCMCM_QC effectively uses the concepts and principles of quantum computation to reduce the computational complexity of genetic optimization for the FCMCM, and it selects better-performing classifiers to form efficient ensembles. Experimental studies on commonly used UCI datasets demonstrate the quality of the proposed FCMCM_QC, and the simulation results show that the FCMCM_QC does enhance the speed of convergence with high efficiency and good quality. (A minimal sketch of a quantum-inspired evolutionary step follows this entry.)

    Keywords: fuzzy cognitive maps; classifier model; quantum computation
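
    The quantum-inspired optimization the abstract refers to can be illustrated with a generic QEA-style loop. The sketch below is a minimal illustration under stated assumptions, not the authors' FCMCM_QC: qubit angles encode a probabilistic population, each generation is "observed" into binary candidates, and the angles are rotated toward the best solution found so far. The fitness function, population size and rotation angle are hypothetical stand-ins for FCM classifier training.

```python
# Generic quantum-inspired evolutionary step (QEA-style); illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_genes, pop_size, delta = 16, 8, 0.05 * np.pi   # hypothetical sizes / rotation step

def fitness(bits):
    return bits.sum()            # toy objective standing in for classifier accuracy

# Each individual is a vector of qubit angles; P(bit = 1) = sin(theta)^2.
theta = np.full((pop_size, n_genes), np.pi / 4)
best_bits, best_fit = np.zeros(n_genes, dtype=int), -np.inf

for generation in range(50):
    # "Observe" every qubit to collapse the population into binary strings.
    bits = (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)
    fits = np.array([fitness(b) for b in bits])
    if fits.max() > best_fit:
        best_fit, best_bits = fits.max(), bits[fits.argmax()].copy()
    # Rotate worse-than-best individuals toward the best bit pattern.
    worse = (fits < best_fit)[:, None]
    theta += delta * np.where(best_bits == 1, 1.0, -1.0) * worse
    theta = np.clip(theta, 0.05, np.pi / 2 - 0.05)   # avoid saturated qubits

print(best_fit, best_bits)
```

    With an FCM classifier in place of the toy fitness, the same loop would search over candidate map weights or member-classifier subsets.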

  • Tamás Bécsi ,
    Szilárd Aradi ,
    Péter Gáspár :
    Using Train Interconnection for Intra-train Communication via CAN (pp. 27-38)

    Abstract: This paper presents the possibilities of using currently installed interconnection cables for extended intra-train communication purposes and proposes a CAN-based communication solution for the problem. The implementation of such a communication technique raises feasibility issues due to the non-standard physical media and the non-fixed network topologies, such as the determination of an achievable bandwidth, the dynamic termination of the network and the enumeration of units. As a basis of on-board passenger information and telemetry solutions for the operator, this solution could be a cost-effective way to improve the quality of service of rail transportation. This paper describes the specialties of the vehicular environment and the proposed network model; it then presents a network lookup algorithm for the automated enrolment and ordering of the train units (a toy enumeration sketch follows this entry). Laboratory and field measurements are presented to validate the feasibility of this solution.

    Keywords: rail transport; vehicle; on-board communication; CAN Network; UIC558
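
    A flavor of the enumeration problem can be given in a few lines. The sketch below is an illustrative assumption, not the paper's lookup algorithm: it models CAN's priority-based arbitration by letting the unenrolled unit with the lowest identifier win each round and assigns consecutive logical addresses. The real algorithm must additionally order units by their physical position in the train, which this toy omits.

```python
# Toy unit-enumeration round; message layout and contention scheme are assumptions.
import random

class TrainUnit:
    def __init__(self, uid):
        self.uid = uid          # factory-unique identifier (e.g. serial number)
        self.address = None     # logical address assigned during enrolment

def enumerate_units(units):
    """Master node assigns consecutive logical addresses to unenrolled units."""
    next_address = 1
    while any(u.address is None for u in units):
        # Broadcast an "identify" request; unenrolled units contend for the bus.
        contenders = [u for u in units if u.address is None]
        # CAN arbitration is priority-based: the lowest identifier wins, which we
        # model by letting the contender with the smallest uid answer first.
        winner = min(contenders, key=lambda u: u.uid)
        winner.address = next_address
        next_address += 1
    return units

units = [TrainUnit(uid) for uid in random.sample(range(1000), 4)]
for u in enumerate_units(units):
    print(f"unit {u.uid:4d} -> logical address {u.address}")
```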

  • Chang-Chih Chung ,
    Chih-Min Lin :

    Abstract: A brain emotional cerebellar model articulation controller (BECMAC) is developed; it is a mathematical model that approximates the judgmental and emotional activity of a brain. A fuzzy inference system is incorporated into the BECMAC to give the novel fuzzy brain emotional cerebellar model articulation controller (FBECMAC), which is also proposed in this paper. The developed FBECMAC has the benefits of fuzzy inference and of judgmental and emotional activity, and it is used to control multi-input multi-output nonlinear systems. A 3-dimensional (3D) chaotic system and a mass spring damper mechanical system are simulated to illustrate the effectiveness of the proposed control method. A comparison between the proposed FBECMAC and other controllers shows that the proposed controller achieves better control performance than the other controllers. (A plain CMAC sketch, the structural core these controllers build on, follows this entry.)

    Keywords: brain emotional cerebellar model articulation controller; fuzzy system; chaotic system; mass spring damper mechanical system
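
    The structural core shared by CMAC-family controllers can be sketched briefly. The code below is a plain CMAC function approximator with tile coding, a minimal sketch only; the brain-emotional and fuzzy layers that define the BECMAC/FBECMAC are omitted, and all sizes and the sine target are hypothetical.

```python
# Plain 1-D CMAC with tile coding; the emotional/fuzzy layers are omitted.
import numpy as np

class CMAC:
    def __init__(self, n_tilings=8, n_tiles=16, lo=-1.0, hi=1.0, lr=0.1):
        self.n_tilings, self.n_tiles = n_tilings, n_tiles
        self.lo, self.width = lo, (hi - lo) / n_tiles
        self.w = np.zeros((n_tilings, n_tiles + 1))  # one weight table per tiling
        self.lr = lr

    def _active(self, x):
        # Each tiling is offset by a fraction of a tile width (classic tile coding).
        for t in range(self.n_tilings):
            offset = t * self.width / self.n_tilings
            idx = int((x - self.lo + offset) / self.width)
            yield t, min(max(idx, 0), self.n_tiles)

    def predict(self, x):
        return sum(self.w[t, i] for t, i in self._active(x))

    def update(self, x, target):
        err = target - self.predict(x)
        for t, i in self._active(x):
            self.w[t, i] += self.lr * err / self.n_tilings

cmac = CMAC()
for _ in range(2000):                       # learn sin(pi * x) on [-1, 1]
    x = np.random.uniform(-1, 1)
    cmac.update(x, np.sin(np.pi * x))
print(cmac.predict(0.5), np.sin(np.pi * 0.5))
```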

  • László Lengyel :
    Validating Rule-based Algorithms (pp. 59-75)

    Abstract: A rule-based system is a series of if-then statements that utilizes a set of assertions, together with rules specifying how to act upon those assertions. Rule-based systems often form the basis of software artifacts that can provide answers to problems in place of human experts; such systems are also referred to as expert systems. Rule-based solutions are widely applied in artificial intelligence-based systems, and graph rewriting is one of the most frequently applied implementation techniques for their realization. As the need for reliable rule-based systems increases, so does research into the verification and validation of graph rewriting-based approaches. Verification and validation mean determining the accuracy of a model transformation / rule-based system and ensuring that the processing output satisfies specific conditions. This paper introduces the concept of taming the complexity of these verification/validation solutions by starting with the most general case and moving towards more specific solutions. Furthermore, we provide a dynamic (online) method to support the validation of algorithms designed and executed in rule-based systems. The proposed approach is based on a graph rewriting-based solution. (A few-line sketch of the opening if-then definition follows this entry.)

    Keywords: verification/validation of rule-based systems; graph rewriting-based model transformations; dynamic verification of model transformations
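
    The opening definition maps directly onto a few lines of code. The sketch below is a minimal forward-chaining engine over assertions and if-then rules, an illustration of that definition rather than of the paper's graph rewriting-based validation method; the rules themselves are hypothetical.

```python
# Minimal forward-chaining engine over assertions and if-then rules.
def forward_chain(facts, rules):
    """facts: set of atoms; rules: list of (premises, conclusion) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)     # rule fires: assert its conclusion
                changed = True
    return facts

rules = [({"temperature_high", "pressure_high"}, "open_valve"),
         ({"open_valve"}, "log_event")]
print(forward_chain({"temperature_high", "pressure_high"}, rules))
```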

  • Raul Robu ,
    Ştefan Holban :
    The Analysis and Classification of Birth Data (pp. 77-96)

    Abstract: The paper presents a study of the births that took place at the Bega Obstetrics and Gynecology Clinic, Timişoara, Romania in 2010. The analysis started from a dataset comprising 2325 births. The article presents a synthesis of the studies that analyze birth data. The Apgar score is the main subject of many such studies. On the one hand, researchers have investigated the relation between the Apgar score and different factors such as the newborn's cry, the level of glucose in the umbilical cord blood, the mother's body mass index before the pregnancy, etc. On the other hand, there are studies demonstrating that the Apgar score is important for the subsequent development of babies. The article presents the attributes of the dataset and how they were preprocessed in order to be analyzed with Weka. The values of each attribute were investigated and the results are presented. The past experience regarding births, expressed through the dataset values, was then used to build classification models. With the help of these models, the Apgar score can be estimated from the known information regarding the mother, the baby and possible medical interventions. The purpose of these estimations is consultative: to help identify which values of the input variables will lead to an optimal Apgar score in certain circumstances. The classification models were built and tested with ten classification algorithms. After the model that produces the best classification results was determined, a dedicated application was developed with the aid of the Weka API, which classifies the birth data using the LogitBoost algorithm. (A hedged, non-Weka sketch of such a classification setup follows this entry.)

    Keywords: birth data; classification; data mining; LogitBoost; Weka
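
    As a rough illustration of the modeling step, the sketch below substitutes scikit-learn for the paper's Weka toolchain (scikit-learn ships no LogitBoost; GradientBoostingClassifier fits a related additive logistic model). The feature columns and the target are hypothetical placeholders, not the clinic's dataset.

```python
# Boosted classification of (hypothetical) birth attributes, 10-fold CV.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.uniform(18, 45, n),        # hypothetical: mother's age
    rng.uniform(16, 40, n),        # hypothetical: body mass index
    rng.integers(35, 43, n),       # hypothetical: gestational age in weeks
])
y = (X[:, 2] >= 37).astype(int)    # hypothetical target: a "good Apgar" proxy

clf = GradientBoostingClassifier(n_estimators=100)
print(cross_val_score(clf, X, y, cv=10).mean())   # 10-fold CV, as is common in Weka
```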

  • Norbert Sram ,
    Márta Takács :
    An Ontology Model-based Minnesota Code (pp. 97-112)

    Abstract: In this paper, the authors present an approach to modeling a classical expert system using an ontology-based solution. The aim was to have an extensible setup where multiple reasoning methods can be used to provide the desired outcome. The case study is a hierarchical rule-based system for the evaluation of reference ECG signals called the Minnesota Code. This paper describes the practical limitations of the original expert-system-based definition of the Minnesota Code and describes an approach to represent it as an ontology that provides support for various reasoning methods. The authors present a possible solution that uses the ontology model and ontology reasoning to provide a diagnostic evaluation of ECG information added to the Minnesota Code ontology, corresponding to the rules defined by the expert-system-based solution. (A sketch of a hierarchical, Minnesota-Code-style rule encoding follows this entry.)

    Keywords: ontology; expert system; Minnesota Code
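
    One way to make a hierarchical rule set consumable by different reasoners is to encode it as data. The sketch below is a loose illustration of that idea in plain Python, not the authors' ontology; the rule family, codes and threshold are placeholders, not actual Minnesota Code criteria.

```python
# Hierarchical, Minnesota-Code-style rule set encoded as data; placeholders only.
MINNESOTA_LIKE_RULES = {
    "1": {                                        # hypothetical "Q-wave" family
        "1-1": lambda ecg: ecg["q_duration_ms"] >= 40,
        "1-2": lambda ecg: 30 <= ecg["q_duration_ms"] < 40,
    },
}

def classify(ecg):
    """Return every rule code whose condition holds for the measured ECG values."""
    return [code
            for family in MINNESOTA_LIKE_RULES.values()
            for code, condition in family.items()
            if condition(ecg)]

print(classify({"q_duration_ms": 42}))   # -> ['1-1']
```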

  • Ádám Thiele ,
    Jiří Hošek :

    Abstract: In order to facilitate everyday archaeometallographic research into archaeological and/or historical objects, this paper introduces a method that uses the results of metallographic examination and hardness measurements to estimate the phosphorus content of iron artifacts. Furthermore, the phosphorus contents encountered in phosphoric iron that was deliberately used as a special material (for pattern-welding etc.) are discussed. Despite certain limitations, the proposed method can be used to estimate the phosphorus content of archaeological iron examined either currently or in the past. (A sketch of a hardness-to-phosphorus calibration follows this entry.)

    Keywords: Phosphoric iron; archaeometallurgy; archaeometallography; Vickers hardness
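
    The kind of estimation the abstract describes can be pictured as a calibration problem: map measured Vickers hardness to phosphorus content. The sketch below fits a simple linear calibration to made-up sample points; the paper's actual relationship and its metallographic preconditions must be taken from the paper itself.

```python
# Hypothetical hardness-to-phosphorus calibration; data points are placeholders.
import numpy as np

hv = np.array([110., 140., 170., 200., 230.])   # hypothetical Vickers hardness (HV)
p  = np.array([0.05, 0.15, 0.28, 0.42, 0.55])   # hypothetical wt% P of the samples

slope, intercept = np.polyfit(hv, p, 1)          # simple linear calibration

def estimate_phosphorus(hardness):
    return slope * hardness + intercept

print(f"{estimate_phosphorus(185.0):.2f} wt% P (illustrative only)")
```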

  • Béla Takarics ,
    Péter Baranyi :

    Abstract: The aim of this paper is to fit the friction compensation problem into the framework of modern polytopic and Linear Matrix Inequality (LMI) based control design methodologies. The paper proves that exact Tensor Product (TP) type polytopic representations exist for the most commonly utilized friction models, such as the Coulomb, Stribeck and LuGre models. The paper also determines and evaluates these TP models via the TP model transformation. The conceptual use of the TP model of friction is demonstrated via a complex control design problem of a 2D aeroelastic wing section. The paper shows how the friction model and the model of the aeroelastic wing section can be merged and transformed into a TP type polytopic model by the TP model transformation, whereupon LMI-based control performance optimization can immediately be executed to yield an observer-based output feedback control solution to given specifications. The example is evaluated via numerical simulations. (A numerical sketch of the sampling and HOSVD step follows this entry.)

    Keywords: friction compensation; LMI-based multi-objective control; TP model transformation; qLPV systems
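
    The numerical core of the TP model transformation, sampling a qLPV model over a parameter grid and compressing it by (HO)SVD, can be shown on a toy. The sketch below applies it to a unit mass whose Stribeck-type friction is written as velocity-dependent damping; all parameter values are hypothetical, and the convex-hull weighting manipulation (e.g. SNNN/CNO) of the full method is omitted.

```python
# Sampling + SVD step of the TP model transformation on a 1-parameter qLPV toy.
import numpy as np

v = np.linspace(0.05, 2.0, 200)                       # parameter (velocity) grid
Fc, Fs, vs, sigma2 = 1.0, 1.5, 0.1, 0.4               # hypothetical Stribeck data
d = sigma2 + (Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)) / v   # equivalent damping

# Sample the qLPV system matrix A(v) = [[0, 1], [0, -d(v)]] over the grid and
# unfold the resulting 200x2x2 tensor along the parameter dimension.
A = np.zeros((v.size, 2, 2))
A[:, 0, 1] = 1.0
A[:, 1, 1] = -d
unfolded = A.reshape(v.size, 4)

U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)
rank = int(np.sum(s > 1e-9 * s[0]))                   # number of TP vertex systems
weights = U[:, :rank]                                 # weighting functions over v
vertices = (np.diag(s[:rank]) @ Vt[:rank]).reshape(rank, 2, 2)

print(f"{rank} vertex systems, max reconstruction error "
      f"{np.abs(weights @ vertices.reshape(rank, 4) - unfolded).max():.2e}")
```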

  • Borislav Đorđević ,
    Nemanja Maček ,
    Valentina Timčenko :

    Abstract: This paper examines the performance of bare-metal hypervisors within the context of Quality of Service evaluation for Cloud Computing. Special attention is paid to the different cache modes of the Linux KVM hypervisor. The main goal was to define analytical models of all caching modes provided by the hypervisor according to the general service time equation. The Postmark benchmark software is used for random-access performance testing with two different workloads, consisting of relatively small objects that simulate a typical mail server. Sequential performance is evaluated with the Bonnie++ benchmark software. The experiments were conducted separately with one, two and three virtual machines running on the hypervisor. The interpretation of the obtained benchmark results according to the proposed models is the main contribution of this research. (A toy service time model follows this entry.)

    Keywords: cloud computing; virtualization; hypervisor; KVM; cache modes
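
    A "general service time equation" style model can be stated in one line: the expected request time is a mixture of cache-hit and cache-miss service times. The sketch below is a hedged toy with hypothetical timings, not the paper's per-mode analytical models; cache=none and cache=writeback are real KVM cache modes used only as labels here.

```python
# Toy cache-hit/miss mixture model of expected request service time.
def expected_service_time(hit_ratio, t_cache, t_disk, t_overhead=0.0):
    """E[T] = overhead + h * t_cache + (1 - h) * t_disk."""
    return t_overhead + hit_ratio * t_cache + (1.0 - hit_ratio) * t_disk

# Hypothetical numbers (milliseconds) for two caching behaviours:
for mode, h, tc, td in [("cache=none",      0.0, 0.05, 8.0),
                        ("cache=writeback", 0.6, 0.05, 8.0)]:
    print(f"{mode:16s} E[T] = {expected_service_time(h, tc, td):.2f} ms")
```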

  • Marija Matotek ,
    Ivan Barać ,
    Dušan Regodić ,
    Gojko Grubor :
    Supply Chain Risk Management Using Software Tool (pp. 167-182)

    Abstract: Risk management is an integral part of every aspect of the supply chain. Unsuccessful risk management can have negative economic and ecological consequences and cause the partial or total loss of a company's business. Supply Chain Risk Management (SCRM) technology helps managers plan for and handle disruptions in the supply chain. Companies that invest in the emerging field of SCRM technology are less likely to sustain costly supply disruptions or negative press because of the actions of their suppliers. In this paper a model of risk management in the supply chain is proposed, following the recommendations of the international standard ISO 31000:2009. An automatic risk evaluation was performed by a software application, yielding 2393 individual risk assessments. In this case study the levels of risk factors obtained by risk treatment were accepted. In order to lessen the subjectivity of the analysts, the same methodology should be applied by more than one risk analyst and their results compared; this provides an objective risk assessment, a key phase in risk management. (A likelihood-impact scoring sketch follows this entry.)

    Keywords: risk; risk management; software tool; supply chain
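
    The automated evaluation step can be illustrated with the common likelihood-times-impact scoring. The sketch below is a minimal, ISO 31000-flavored illustration; the sample risks, the 1-5 scales and the bucket thresholds are hypothetical, not the study's 2393 assessed items.

```python
# Likelihood x impact scoring with bucket thresholds; all values hypothetical.
RISKS = [("supplier insolvency", 2, 5),
         ("transport delay",     4, 2),
         ("data breach",         1, 4)]   # (name, likelihood 1-5, impact 1-5)

def risk_level(score):
    if score >= 15: return "high"
    if score >= 6:  return "medium"
    return "low"

for name, likelihood, impact in RISKS:
    score = likelihood * impact
    print(f"{name:20s} score={score:2d} -> {risk_level(score)}")
```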

  • Regina Reicher ,
    Ágnes Szeghegyi :

    Abstract: The situation of Hungarian micro, small and medium-sized enterprises has become very difficult as a result of the economic changes of recent years. The open European market, the economic crisis and the ever-increasing competition demand fast reactions, creating a serious challenge for these enterprises coping with a lack of capital and other resources. Therefore, companies in the SME sector pay increasing attention to serving their customers at a high level, and in order to achieve this they often seek an IT solution. Owing to these circumstances it is extremely important for enterprises coping with harsh economic conditions to choose and implement the most suitable CRM IT solution quickly, efficiently and at the lowest possible risk of failure. This requires, however, thorough knowledge of the competencies of the organization, of the wide range of solutions available on the market and of an adequate methodology that offers them success. The present research aims to explore the factors affecting the decision-making process in the course of which SME heads select and implement a CRM system. As a result of our research, the specifications defined by experts and by the heads of user companies could be determined and classified.

    Keywords: CRM implementation; CRM selection; factors affecting successful implementation; SME

  • Vladimir Djakovic ,
    Igor Mladenovic ,
    Goran Andjelic :

    Abstract: The research presented in this study is the analysis and implementation of parametric and non-parametric Value at Risk (VaR) calculation models for predicting risk and determining the maximum potential loss from investment activities. The study sample includes the stock indices of the Serbian (BELEX15), Hungarian (BUX), Croatian (CROBEX) and Slovenian (SBITOP) markets, from 1 January 2006 to 31 December 2012. The methodology involves the use of analysis and synthesis, as well as relevant statistical and mathematical methods. The study is based on the assumption that there is no statistically significant difference among the different models of risk management with respect to the performance of investment risk prediction in the markets of the observed transition economies. The main aim of the study is to assess the performance of risk management models in practice, in order to operationally optimize investment decisions. The research results indicate the implementation adequacy of the tested models in the observed transitional markets, with full consideration of their specifics. (A sketch of the two standard VaR estimators follows this entry.)

    Keywords: risk management; value at risk; extreme value theory; historical simulation; delta normal VaR
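
    The two most common estimators the study covers are short formulas. The sketch below computes one-day 95% VaR by historical simulation (an empirical quantile) and by the delta-normal method (mean plus a normal quantile times the standard deviation); the return series is simulated, not the BELEX15/BUX/CROBEX/SBITOP data.

```python
# Historical-simulation and delta-normal VaR on a simulated return series.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
returns = rng.normal(0.0002, 0.012, 1750)   # ~7 years of hypothetical daily returns
alpha = 0.05                                 # 95% confidence level

var_hist = -np.quantile(returns, alpha)                         # historical simulation
var_norm = -(returns.mean() + norm.ppf(alpha) * returns.std())  # delta-normal

print(f"95% one-day VaR: historical={var_hist:.4f}, delta-normal={var_norm:.4f}")
```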

  • Saeed Alaei ,
    Alireza Hajji ,
    Reza Alaei ,
    Masoud Behravesh :

    Abstract: In this paper, we study the differences between the Centralized and Decentralized approaches in a two-echelon stochastic inventory system under the lost-sales policy. We formulate the setting in which the retailer applies an (r, Q) inventory policy, and model its relation with the upper echelon, which acts as a manufacturer; this situation has not been considered in the literature before. The Centralized approach yields the optimal solution of the system, while the Decentralized one is based on a Stackelberg game in which the manufacturer is the leader. Demand arrives according to a stationary Poisson process. We derive the long-run average cost functions, and then develop a set of computational steps to obtain the solutions. Furthermore, we provide a numerical study to compare the two approaches. The main conclusions are: (a) the Decentralized approach reduces the system's cost efficiency; and (b) the Decentralized approach raises the lead time, the order quantity, and the supply chain inventory relative to the Centralized approach. (A toy (r, Q) simulation follows this entry.)

    Keywords: Continuous Review Policy; Two-Echelon Inventory System; Inventory Management; Stackelberg Game
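
    The retailer's side of the model can be pictured with a toy simulation: an (r, Q) policy under Poisson demand, lost sales and a fixed lead time. The sketch below is a periodic-review approximation with hypothetical costs, whereas the paper works with a continuous-review policy and derives long-run average costs analytically.

```python
# Toy (r, Q) policy with Poisson demand, lost sales, fixed lead time.
import numpy as np

rng = np.random.default_rng(7)
r, Q, lead_time = 5, 20, 3                 # reorder point, order quantity, periods
h, p, K = 1.0, 10.0, 50.0                  # holding / lost-sale penalty / order cost

inventory, pipeline, cost = Q, [], 0.0
for t in range(10_000):
    pipeline = [(due - 1, q) for due, q in pipeline]
    arrived = sum(q for due, q in pipeline if due <= 0)
    pipeline = [(due, q) for due, q in pipeline if due > 0]
    inventory += arrived

    demand = rng.poisson(4)
    lost = max(demand - inventory, 0)      # unmet demand is lost, not backordered
    inventory = max(inventory - demand, 0)

    position = inventory + sum(q for _, q in pipeline)
    if position <= r:                      # continuous-review trigger, checked per period
        pipeline.append((lead_time, Q))
        cost += K
    cost += h * inventory + p * lost

print(f"long-run average cost per period ~ {cost / 10_000:.2f}")
```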