Daten im B2B-Ökosystem teilen und nutzen: Wie KMU Voraussetzungen schaffen und Hürden überwinden
(2024)
"Big data" hold great potential for making value creation more efficient and for driving innovation. Data are often generated at the interfaces between multiple actors in business-to-business ecosystems and must be shared among those actors. Companies nevertheless struggle to turn data into value and to share data within the ecosystem. The causes are less technical than organisational. This contribution identifies five perspectives that capture the hurdles and preconditions in this process: (1) a data-driven organisational culture, (2) trust between the actors, (3) making the value of data concrete, (4) data security, and (5) legal and governance aspects. A case study of a typical data ecosystem around a manufacturing SME substantiates these preconditions and hurdles. It shows that companies that want to share data within an ecosystem must change holistically.
In this paper, we consider the question of data aggregation using the practical example of emissions data for economic activities for the sustainability assessment of regional bank clients. Given the current scarcity of company-specific emissions data, an approximation relies on available public data. These data are reported in different standards in different sources. To determine a mapping between the different standards, an adaptation of the Covariance Matrix Self-Adaptation Evolution Strategy is proposed. The obtained results show that high-quality mappings are found. Moreover, our approach is transferable to other data compatibility problems, such as merging emissions data for other countries or bridging the gap between completely different data sets.
Why do some countries assign a major role to wind energy in decarbonizing their electricity systems, while others are much less committed to this technology? We argue that processes of (de-)legitimation, driven by discourse coalitions who strategically employ certain storylines in public debates, provide part of the answer. To illustrate our approach, we comparatively investigate public discourses surrounding wind energy in Austria and Switzerland, two countries that differ strongly in wind energy deployment. By combining a qualitative content analysis and a discourse network analysis of 808 newspaper articles published 2010–2020, we identify four distinct sets of storylines used to either delegitimize or legitimize the technology. Our study indicates that low deployment rates in Switzerland can be related to the prominence of delegitimizing storylines in the public discourse, which result in a rather low socio-political acceptance of wind energy. In Austria, by contrast, there is more consistent support for wind energy by discourse coalitions using a broad set of legitimizing storylines. By bridging the related but separate literatures of technology legitimacy and social acceptance, our study contributes to a better understanding of socio-political conflict and divergence in low-carbon technological pathways.
A step change is needed in the deployment of renewable energy if the triple challenge of ensuring climate change mitigation, energy security, and energy affordability is to be met. Yet, social acceptance of infrastructure projects and policies remains a key concern. While there have been decades of fruitful research on the social acceptance of wind energy and other renewables, much of the extant research is cross-sectional in nature, failing to capture the important dynamic processes that can make or break renewable energy projects. This paper introduces a Special Issue of Energy Policy which focuses on the neglected topic of the dynamics of social acceptance of renewable energy, drawing on contributions made at an international research conference held in St. Gallen (Switzerland) in June 2022. In addition to introducing these papers and drawing out common themes, we also seek to offer some conceptual clarity on the issue of dynamics in social acceptance, taking into account the influence of time, power, and scale in shaping decision-making processes. We conclude by highlighting a number of avenues of potential future research.
In 2021, a prominent Austrian dairy producer suffered an IT attack and was completely paralysed. Without clearly defined mitigation measures in place, major disruptions were caused along the whole supply chain, including logistics service providers, governmental food safety bodies, and retailers (i.e., supermarkets and convenience stores). In this paper, we ask how digitisation and digital transformation impact IT security, especially when considering the complex company ecosystems of food production and food supply chains in Austria. The problem statement stems from a gap in knowledge of key differences in approaches towards IT security, resilience, risk management and, especially, business interfaces between food suppliers, supermarkets, distributors, logistics and other service providers. To answer the related research questions, the authors first conduct literature research, highlight common guidelines and standardisation, and look at state-based recommendations for critical infrastructure. In a second step, the paper describes in detail a quantitative and qualitative survey with Austrian food companies (producers and retailers). A description of recommended measures for the industry, further steps, and an outlook conclude the paper.
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. For a plan to be robust towards uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive. The planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
The research project Data Sharing Framework investigated data sharing in the context of data-based services and products in ecosystems from five perspectives: culture, trust, value, law & governance, and security. The research results confirm the relevance of these perspectives and show that these aspects act both as barriers and as drivers for data use and data exchange between companies.
The starting point were the following assumptions guiding research and practice:
• Thesis 1: By using and sharing data, SMEs can generate added value in the form of new products and services. From a scientific perspective, the focus of the topic of data and data science has so far been predominantly on the technical implementation of data-intensive business models and collaborations by companies.
• Thesis 2: Technical implementation is a necessary condition for data-based services, but it is not sufficient to trigger SMEs' willingness to cooperate and to share their data, especially in a cross-border context such as the programme region, where numerous stakeholders hesitate to share data.
• Thesis 3: SMEs need data access and data trust structures to actually realise potential cooperation gains. Among other things, this requires common standards, a shared understanding of the value of data, and data governance combined with trust standards yet to be defined, which provide the necessary formal and informal security.
The following gives an overview of the resulting findings:
Culture
The organisational culture perspective focuses on thinking and acting within the company and the ecosystem. An organisational culture that enables working with data, data science practices and, above all, data sharing places data at the centre of the value creation process. This requires a general awareness of the topic of data, permeable boundaries within and between companies, and a new understanding of roles, structures and processes in the company.
Trust
Trust is of great importance in the ecosystem. Involving internal stakeholders and starting with smaller pilot projects is suggested as a way to build trust within the organisation and with external partners.
Value
The value of data is highlighted as a necessary precondition. Companies should know the potential value of data flows before deciding whether to share and use these data. A rough quantification of the value flow is recommended, with a more detailed analysis where appropriate.
Law & governance
To account for the legal framework of shared data use, organisations should first establish internal data governance so that they can react to new regulatory developments. Setting up data asset management, data IP and compliance management, and data contract management is recommended.
Data security
In the security context, methods for ensuring data integrity, privacy and security are crucial. A collaborative approach to implementing security standards, involving ICT experts, is recommended. Initially, best practices may suffice, but in the longer term continuous security risk assessment and integration into business processes should be pursued.
This paper presents a comparison between production and simulation data, carried out as part of a larger initiative on the use of shop-floor data at a project partner in the automotive industry. In this project, the data generated during mould-filling simulation were compared with the data from the final tool acceptance test in order to analyse how closely they match. The better the simulation, the faster the overall tool development process can be completed; as a core process, it carries massive savings potential and thus competitive advantage.
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes. This is often the case because they aim to start with complicated and time-intensive Machine Learning scenarios. This work evaluates the Process Capability Analysis (PCA) as a pragmatic, easy and quick way of leveraging the gathered machine data from the production process. The area of application considered is injection molding. After describing all the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA results in multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights to be drawn about process stability and the associated product quality.
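The capability indices at the heart of such an analysis can be sketched directly. The formulas below are the textbook Cp/Cpk definitions; the specification limits and measurements are made-up illustrative values, not data from the case study:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Classic process capability indices.

    Cp compares the specification width (USL - LSL) to the process
    spread (6 sigma); Cpk additionally penalises off-centre processes.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical injection-moulding measurements (a part dimension in mm)
measurements = [9.98, 10.02, 10.01, 9.99, 10.00, 10.03, 9.97, 10.01]
cp, cpk = process_capability(measurements, lsl=9.90, usl=10.10)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")
```

Cpk ≤ Cp always holds, with equality for a perfectly centred process; values above roughly 1.33 are conventionally read as a capable process.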
A model is presented that allows for the calculation of the success probability with which a vanilla Evolution Strategy converges to the global optimizer of the Rastrigin test function. From this, a population size scaling formula is derived that allows for an estimation of the population size needed to ensure high convergence security depending on the search space dimensionality.
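The paper's model and scaling formula are not reproduced here, but the kind of algorithm analysed can be illustrated with a minimal (μ/μ, λ) Evolution Strategy with log-normal step-size self-adaptation on the Rastrigin function; all parameter choices below are arbitrary illustrative values:

```python
import math
import random

def rastrigin(x, a=10.0):
    """Rastrigin test function: highly multimodal, global minimum 0 at the origin."""
    return a * len(x) + sum(xi * xi - a * math.cos(2 * math.pi * xi) for xi in x)

def mu_lambda_es(n=5, mu=50, lam=200, generations=300, seed=1):
    """Minimal (mu/mu_I, lambda)-ES with sigma self-adaptation.

    A generic sketch, not the exact strategy variant analysed in the paper.
    """
    rng = random.Random(seed)
    tau = 1.0 / math.sqrt(2 * n)                    # learning rate for sigma
    parent = [rng.uniform(-5, 5) for _ in range(n)]
    sigma = 1.0
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            s = sigma * math.exp(tau * rng.gauss(0, 1))   # mutate step size
            x = [xi + s * rng.gauss(0, 1) for xi in parent]
            offspring.append((rastrigin(x), s, x))
        offspring.sort(key=lambda t: t[0])
        best = offspring[:mu]
        # intermediate (centroid) recombination of the mu best offspring
        parent = [sum(x[i] for _, _, x in best) / mu for i in range(n)]
        sigma = sum(s for _, s, _ in best) / mu
    return rastrigin(parent)

final = mu_lambda_es()
print(final)
```

With too small a population the strategy tends to get trapped in one of Rastrigin's many local minima, which is exactly the effect the population size scaling in the paper quantifies.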
Through mandatory ESG (environmental, social, governance) reporting, large companies must disclose their ESG activities, showing how sustainability risks are incorporated in their decision-making and production processes. This disclosure obligation, however, does not apply to small and medium-sized enterprises (SMEs), creating a gap in the ESG dataset. Banks are therefore required to collect sustainability data of their SME customers independently to ensure complete ESG integration in the risk analysis process for loans. In this paper, we examine ESG risk analysis through a smart science approach, focusing on possible value outcomes of sustainable smart services for banks as well as for their (SME) customers. The paper describes ESG factors, how services can be derived from them, targeted ESG metrics, and an ESG Service Creation Framework (business ecosystem building, process model, and value creation). The description of an exemplary use case, highlighting the necessary ecosystem for service creation as well as the created value, concludes the paper.
The role of entrepreneurs and intrapreneurs in the current zeitgeist is to drive innovation and to re-shape rigid, established processes for businesses as well as consumers. They use new viewpoints to pioneer new (business) models which focus on ‘smartness’ rather than the purely monetary and short-sighted models of yesteryear. Fostering and supporting the culture of this current zeitgeist is a major challenge for entre- and intrapreneurial support infrastructures, namely startup centres and innovation hubs of universities and other public institutions as well as innovation centres of private companies. Support may range from access to funding, over provision of resources such as offices or computing hardware, to coaching in the development of business ideas and strategic roadmaps for product and service deployment. In this paper, we focus on describing the status quo of the aforementioned support infrastructures in Vorarlberg and the Lake Constance region, then extend the scope to existing (international) approaches for aiding founders and innovators in the development of smart services. An analysis of success stories of the Vorarlberg startup centre ‘startupstube’ and other initiatives, including their comparison to international counterparts, builds the basis for a methodological framework for (service science) coaching in entre- and intrapreneurial support infrastructures. The paper concludes with the description of a framework for choosing the right methods and tools to create service value in entre-/intrapreneurship based upon tested, proven know-how, and for defining support infrastructure needs based upon pre-defined stakeholder and target groups as well as the (industry) sectors of the innovators.
Purpose: Although there is an apparent potential in using data for advanced services in manufacturing environments, SMEs are reluctant to share data with their ecosystem partners, which prevents them from leveraging this potential. Therefore, the purpose of this paper is to analyse the reasons behind these resistances. The argumentation paves the way for elaborating countermeasures that are adequate for the specific situation and the typical capabilities of SMEs.
Design/Methodology/Approach: The analysis is based on literature research and in-depth interviews with management representatives of 15 companies in manufacturing service ecosystems. Half of these are manufacturers and the other half technology or service providers for manufacturers. They are SMEs or partly larger companies operating in structures that are typical for SMEs.
Findings: Data sharing hurdles are investigated in five dimensions: 1. quantifying the value of data, 2. willingness to share data and trust, 3. organizational culture and mindset, 4. legal aspects, and 5. security and privacy. The ability to quantify the value of data is a necessary but not sufficient precondition for data sharing, which must be enabled by adequate measures in the other four dimensions.
Originality/Value: The findings of this empirical study and the solution approach provide an SME-specific framework to analyze hurdles that must be overcome for sharing data in an ecosystem.
Manufacturing SMEs can apply the framework to overcome the hurdles by specific insights and solution approaches. Furthermore, the analysis illustrates the future research direction of the project towards a comprehensive solution approach for data sharing in a manufacturing ecosystem.
The design and development of smart products and services with data science enabled solutions forms a core topic of the current trend of digitalisation in industry. Enabling skilled staff, employees, and students to use data science in their daily work routine of designing such products and services is a key concern of higher education institutions, including universities, company workshop providers, and further education. The scope and usage scenario of this paper is to assess software modules (‘tools’) for integrated data and analytics as a service (DAaaS). The tools are usually driven by machine learning, may be deployed in cloud infrastructures, and are specifically targeted at particular needs of the industrial manufacturing, production, or supply chain sector.
The paper describes existing theories and previous work, namely methods used in didactics, work done for visually designing and using machine learning algorithms (no-code / low- code tools), as well as combinations of these two topics. For tools available on the market, an extended assessment of their suitability for a set of learning scenarios and personas is discussed.
Smart services disrupt business models and have the potential to stimulate the circular economy transition of regions, enabling an environmentally friendly atmosphere for sustainable and innovation-driven growth of regions. Although smart services are powerful means for deploying circular economy goals in industrial practices, there is little systematic guidance on how the adoption of smart services could improve resource efficiency and stimulate smart regional innovation-driven growth, enabled through circular design. Implemented in the scope of Vorarlberg’s smart specialization strategy, this paper contributes to the literature on the circular economy and regional innovation-driven growth by assessing critical factors of the value creation and value capture implemented within the scope of the quadruple helix system. By identifying the main challenges and opportunities of collaborative value creation and value capture in setting up smart circular economy strategies and by assessing the role of innovation actors within the quadruple helix innovation system, the study provides recommendations and a set of guidelines for managers and public authorities in managing the circular transition. Finally, based on the analysis of the role of actors in creating shared value and scaling up smart circular economy practices in quadruple helix innovation systems, the paper investigates the role of banks as enablers of circular economy innovation-driven regional growth and smart value creation.
Small and medium-sized enterprises often face resource deficits and therefore depend on cooperating with other actors to stay innovative in a competitive environment. Establishing and maintaining actual co-creation and service interaction strategies, however, is challenging. A reason for this is the complexity of finding methodologies and tools to create valuable outcomes and the lack of knowledge of collaboration toolsets, also in virtual environments. This paper introduces an Innovation-Method-Framework consisting of innovation methods for increased service interaction and value co-creation among service stakeholders. Toolsets for the framework’s practical application are also provided.
Recent developments in the area of Natural Language Processing (NLP) increasingly allow for the extension of such techniques to hitherto unidentified areas of application. This paper deals with the application of state-of-the-art NLP techniques to the domain of Product Safety Risk Assessment (PSRA). PSRA is concerned with the quantification of the risks a user is exposed to during product use. The use case arises from an important process of maintaining due diligence towards the customers of the company OMICRON electronics GmbH.
The paper proposes an approach to evaluate the consistency of human-made risk assessments that are proposed by potentially changing expert panels. Along the stages of this NLP-based approach, multiple insights into the PSRA process allow for an improved understanding of the related risk distribution within the product portfolio of the company. The findings aim at making the current process more transparent as well as at automating repetitive tasks. The results of this paper can be regarded as a first step to support domain experts in the risk assessment process.
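As a hypothetical illustration of the simplest kind of consistency signal such a pipeline might start from (the paper's actual state-of-the-art NLP techniques are far richer), two risk descriptions can be compared via cosine similarity of term-frequency vectors; the example sentences are invented, not from the OMICRON use case:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity of term-frequency vectors - a minimal stand-in
    for the embedding-based comparisons used in modern NLP pipelines."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    num = sum(va[t] * vb[t] for t in set(va) & set(vb))
    den = (math.sqrt(sum(c * c for c in va.values()))
           * math.sqrt(sum(c * c for c in vb.values())))
    return num / den if den else 0.0

# Two near-duplicate risk descriptions versus an unrelated one
a = "sharp edge may cut the user during installation"
b = "user may be cut by a sharp edge during installation"
c = "device overheats when enclosure vents are blocked"
print(cosine_similarity(a, b), cosine_similarity(a, c))
```

Pairs of assessments with high textual similarity but divergent risk ratings would then be candidates for review by the expert panel.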
Arbeitspaket 3: Ausschöpfung des Innovationspotentials von smarten Technologien - FH Vorarlberg
(2022)
To create a map of an unknown area, autonomous robots must follow an exploration strategy without knowing the optimal paths, so as to reduce the time needed to map the whole area. To reduce this time further, multiple robots can work together to create the map more efficiently. However, without proper coordination, the time a team of autonomous robots needs to explore the unknown area can exceed the time needed by a single robot. To counteract these challenges, a shared infrastructure is needed that extracts useful information for the individual robots from the information shared by all robots, so that the exploration can be coordinated. These measures introduce new challenges to the system, concerning the load on the communication infrastructure as well as the overall task of exploring and mapping becoming dependent on correct communication and the robustness of the shared team infrastructure. Therefore, the amount of communication and each robot's dependency on the rest of the team must be reduced to ensure that the robots can continue working even if communication with the shared infrastructure fails.
Mobility choices - an instrument for precise automatized travel behavior detection & analysis
(2021)
Towards a strategic management framework for engineering of organizational robustness and resilience
(2020)
For a given set of banks, how big can losses in bad economic or financial scenarios possibly get, and what are these bad scenarios? These are the two central questions of stress tests for banks and the banking system. Current stress tests select stress scenarios in a way which might leave aside many dangerous scenarios and thus create an illusion of safety; and which might consider highly implausible scenarios and thus trigger a false alarm. We show how to select scenarios systematically for a banking system in a context of multiple credit exposures. We demonstrate the application of our method in an example on the Spanish and Italian residential real estate exposures of European banks. Compared to the EBA 2016 stress test our method produces scenarios which are equally plausible as the EBA stress scenario but yield considerably worse system wide losses.
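For a loss that is linear in the risk-factor moves, this kind of systematic scenario search has a well-known closed form: over all scenarios whose Mahalanobis distance from the expected move is at most k, the loss-maximising scenario can be written down analytically. The sketch below uses made-up exposures and covariances, not the EBA figures from the paper:

```python
import math

def mat_vec(m, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def worst_case_scenario(w, mu, cov, k):
    """Loss-maximising scenario for a linear loss L(r) = -w.r over the
    Mahalanobis ellipsoid of radius k around mu: r* = mu - k*cov*w/sqrt(w'cov w).
    A sketch of the closed form, not the paper's full multi-exposure method."""
    cov_w = mat_vec(cov, w)
    scale = k / math.sqrt(dot(w, cov_w))
    scenario = [m - scale * cw for m, cw in zip(mu, cov_w)]
    loss = -dot(w, mu) + k * math.sqrt(dot(w, cov_w))
    return scenario, loss

# Hypothetical exposures to two risk factors (e.g. two real-estate indices)
w = [100.0, 80.0]            # portfolio sensitivity to each factor move
mu = [0.01, 0.005]           # expected factor moves
cov = [[0.04, 0.018],        # factor covariance matrix
       [0.018, 0.09]]
scenario, loss = worst_case_scenario(w, mu, cov, k=2.0)
print(scenario, loss)
```

The radius k plays the role of the plausibility constraint: every scenario on the ellipsoid is equally plausible, and the search returns the most damaging one among them.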
With cloud computing and multi-core CPUs, parallel computing resources are becoming more and more affordable and commonly available. Parallel programming should be just as easily accessible for everyone. Unfortunately, existing frameworks and systems are powerful but often very complex to use for anyone who lacks knowledge of the underlying concepts. This paper introduces a software framework and execution environment whose objective is to provide a system that is easily usable for everyone who could benefit from parallel computing. Some real-world examples are presented with an explanation of all the steps that are necessary for computing in a parallel and distributed manner.
Blood flow and ventilatory flow strongly influence the concentrations of volatile organic compounds (VOCs) in exhaled breath. The physicochemical properties of a compound (e.g., water solubility) additionally determine if the concentration of the compound in breath reflects the alveolar concentration, the concentration in the upper airways, or a mixture of both. Mathematical modeling based on mass balance equations helps to understand how measured breath concentrations are related to their corresponding blood concentrations and physiological parameters, such as metabolic rates and endogenous production rates. In addition, the influence of inhaled compounds on their exhaled concentrations can be quantified and appropriate correction formulas can be derived. Isoprene and acetone, two endogenous VOCs with very different water solubility, have been modeled to explain the essential features of their behavior in breath. This chapter introduces the theory of physiological modeling of exhaled VOCs, with examples of isoprene and acetone.
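The classical building block of such models is the steady-state alveolar mass balance, leading to the Farhi equation. The sketch below uses the conventional symbols (alveolar ventilation V̇_A, cardiac output Q̇_c, blood:air partition coefficient λ), which may differ from the chapter's notation:

```latex
% Steady state: what alveolar ventilation removes must equal
% what pulmonary perfusion delivers
\dot{V}_A \, C_A = \dot{Q}_c \left( C_{\bar{v}} - C_a \right),
\qquad C_a = \lambda_{b:\mathrm{air}} \, C_A
% Solving for the alveolar concentration yields the Farhi equation:
C_A = \frac{C_{\bar{v}}}{\lambda_{b:\mathrm{air}} + \dot{V}_A / \dot{Q}_c}
```

For a highly water-soluble compound such as acetone, λ_{b:air} is large, and exchange in the upper airways (not captured by this alveolar-only balance) becomes important — which is why extended models of the kind described in this chapter are needed.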
Post-operative isoflurane has been observed to be present in the end-tidal breath of patients who have undergone major surgery, for several weeks after the surgical procedures. A major new noncontrolled, non-randomized, and open-label approved study will recruit patients undergoing various surgeries under different inhalation anaesthetics, with two key objectives, namely to record the washout characteristics following surgery, and to investigate the influence of a patient’s health and the duration and type of surgery on elimination. In preparation for this breath study using proton transfer reaction time-of-flight mass spectrometry (PTR-TOF-MS), it is important to identify first the analytical product ions that need to be monitored and under what operating conditions. In this first paper of this new research programme, we present extensive PTR-TOF-MS studies of three major anaesthetics used worldwide, desflurane (CF3CHFOCHF2), sevoflurane ((CF3)2CHOCH2F), and isoflurane (CF3CHClOCHF2), and a fourth one, which is used less extensively, enflurane (CHF2OCF2CHFCl), but is of interest because it is an isomer of isoflurane. Product ions are identified as a function of reduced electric field (E/N) over the range of approximately 80 Td to 210 Td, and the effects of operating the drift tube under ‘normal’ or ‘humid’ conditions on the intensities of the product ions are presented. To aid in the analyses, density functional theory (DFT) calculations of the proton affinities and the gas-phase basicities of the anaesthetics have been determined. Calculated energies for the ion-molecule reaction pathways leading to key product ions, identified as ideal for monitoring the inhalation anaesthetics in breath with a high sensitivity and selectivity, are also presented.
Who wouldn't want it: an intelligent, efficient and economical manufacturing process? Many companies are currently betting on digitalisation, improving both their own production and production networked with external partners. Digitalisation brings progress, but it also exposes the increasing complexity of today's production networks. Numerous decisions must be made to ensure efficient and secure exchange between different plants.
A look at existing models can help: in the i4Production project of the IBH-Lab KMUdigital, teams at three sites in the three neighbouring countries Germany (HTWG Konstanz), Austria (FH Vorarlberg) and Switzerland (NTB Buchs, RhySearch) worked on a networked process landscape. Using a joint, standardised automation concept, the internationally networked model factory produces a cyber-physical system (CPS) in the form of a customer-individualised model vehicle, which the customer can configure from various variants or design individually. The decentralised production allows data to be passed across national borders in real time and simulates a cross-border business ecosystem.
The findings of the i4Production project show how small and medium-sized enterprises (SMEs) can organise distributed production, including the integration of employees and customers into digitalised, highly automated, customer-individualised production.
For companies, this Industry 4.0 process landscape is made publicly available as a model for their own manufacturing in the newly built CNC precision manufacturing laboratory "Werkstatt4" at RhySearch. "Werkstatt4" offers SMEs a digital process environment in which they can test which measures can turn the initially stated wish for optimised manufacturing into reality.
In the following, we present the concept of the international model factory i4Production, the various work steps at the participating universities, and the most important findings for SMEs in the Lake Constance region. We are happy to support you in shaping the transformation towards Company 4.0: just get in touch.
With the emergence of the recent Industry 4.0 movement, data integration is now also being driven along the production line, made possible primarily by the use of established concepts of intelligent supply chains, such as the digital avatar. Digital avatars – sometimes also called Digital Twins or more broadly Cyber-Physical Systems (CPS) – are already successfully used in holistic systems for intelligent transport ecosystems, similar to the use of Big Data and artificial intelligence technologies interwoven with modern production and supply chains. The goal of this paper is to describe how data from interwoven, autonomous and intelligent supply chains can be integrated into the diverse data ecosystems of Industry 4.0, influenced by a multitude of data exchange formats and varied data schemas. In this paper, we describe how a framework for supporting SMEs was established in the Lake Constance region and describe a demonstrator sprung from the framework. The demonstrator project’s goal is to exhibit and compare two different approaches towards optimisation of manufacturing lines. The first approach is based upon static optimisation of production demand, i.e. exact or heuristic algorithms are used to plan and optimise the assignment of orders to individual machines. In the second scenario, we use real-time situational awareness – implemented as a digital avatar – to assign local intelligence to jobs and raw materials in order to compare the results to the traditional planning methods of scenario one. The results are generated using event-discrete simulation and are compared to common (heuristic) job scheduling algorithms.
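As a stand-in for the static-optimisation baseline of the first scenario (the project's actual planner is not reproduced here), a classic list-scheduling heuristic for assigning orders to identical parallel machines can be sketched; the job times and machine count are made-up values:

```python
import heapq

def list_schedule(job_times, machines):
    """Longest-Processing-Time list scheduling on identical parallel machines:
    sort jobs by decreasing duration and always give the next job to the
    currently least-loaded machine. A generic heuristic baseline sketch."""
    loads = [(0.0, m) for m in range(machines)]   # (current load, machine id)
    heapq.heapify(loads)
    assignment = {}
    for job, t in sorted(enumerate(job_times), key=lambda jt: -jt[1]):
        load, m = heapq.heappop(loads)            # least-loaded machine
        assignment[job] = m
        heapq.heappush(loads, (load + t, m))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

jobs = [7, 5, 4, 3, 3, 2]   # hypothetical processing times of six orders
assignment, makespan = list_schedule(jobs, machines=2)
print(assignment, makespan)
```

A real-time digital-avatar approach would instead let each job react to the machines' actual states as disturbances occur, which is exactly the comparison the demonstrator sets up.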
Complementarities and synergies of quadruple helix innovation design in smart city development
(2020)
Increased urbanization trends are stimulating regional needs to support transitions from urban environments to smart cities, using this holistic perspective as a source of innovation. Strong relations between smart cities and urban and regional development are getting increased attention at both policy and implementation level, providing fertile ground for the execution of the new European policy frameworks that support quadruple helix approaches to innovation. Smart specialization strategies (RIS3) encompass such initiatives, placing ICT and collaboration between academia, industry, government, and citizens at the center of urban innovation. However, there is still a lack of research on the effects of such approaches to innovation, involving both quadruple helix clusters and ICT, in utilizing innovation potentials for developing smart cities. This study aims to increase the understanding of how quadruple helix urban innovation strengthens the competitiveness of regions by improving their local smart areas (RIS3). We identified smart specialization patterns and applied a comparative benchmark between nine small- and medium-sized urban regions in Central Europe. Building on these results, the study provides an overview of the effects of RIS3 strategies implemented through quadruple helix innovation clusters on the competitiveness of regions and smart city development.
A modified matrix adaptation evolution strategy with restarts for constrained real-world problems
(2020)
In combination with successful constraint handling techniques, a Matrix Adaptation Evolution Strategy (MA-ES) variant (the εMAg-ES) turned out to be a competitive algorithm on the constrained optimization problems proposed for the CEC 2018 competition on constrained single objective real-parameter optimization. A subsequent analysis points to additional potential in terms of robustness and solution quality. Introducing a restart scheme and adjusting the constraint handling techniques realize this potential and simplify the configuration. The resulting BP-εMAg-ES algorithm is applied to the constrained problems proposed for the IEEE CEC 2020 competition on Real-World Single-Objective Constrained optimization. The novel MA-ES variant realizes improvements over the original εMAg-ES in terms of feasibility and effectiveness on many of the real-world benchmarks. The BP-εMAg-ES achieves a feasibility rate of 100% on 44 out of 57 real-world problems and improves the best-known solution in 5 cases.
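The restart idea referred to above can be sketched generically. The following is a hedged toy illustration only: `optimize_once` is a trivial (1+1)-style random search standing in for a single ES run, not the actual (BP-)εMAg-ES, and all parameters are invented.

```python
# Hedged sketch of a restart scheme around a black-box optimizer.
# The inner optimizer is a toy stand-in, NOT the (BP-)epsilon-MAg-ES.
import random

def optimize_once(f, dim, budget, rng):
    """One run: simple (1+1)-style random search with Gaussian steps."""
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = f(x)
    for _ in range(budget):
        y = [xi + rng.gauss(0, 0.5) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x, fx

def optimize_with_restarts(f, dim, restarts=5, budget=2000, seed=1):
    """Run the inner optimizer several times from fresh starting points
    and keep the best result found across all restarts."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(restarts):
        x, fx = optimize_once(f, dim, budget, rng)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

sphere = lambda x: sum(v * v for v in x)
x, fx = optimize_with_restarts(sphere, dim=3)
```

Restarts trade extra function evaluations for robustness: a run trapped near a poor or infeasible region is simply replaced by a fresh one.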
Real-time measurements of the differences in inhaled and exhaled, unlabeled and fully deuterated acetone concentration levels, at rest and during exercise, have been conducted using proton transfer reaction mass spectrometry. A novel approach to continuously differentiate between the inhaled and exhaled breath acetone concentration signals is used. This leads to unprecedentedly fine-grained data of inhaled and exhaled concentrations. The experimental results obtained are compared with those predicted using a simple three-compartment model that theoretically describes the influence of inhaled concentrations on exhaled breath concentrations for volatile organic compounds with high blood:air partition coefficients, and hence is appropriate for acetone. Good agreement between the predicted and observed concentrations is obtained. Our results highlight that the influence of the upper airways cannot be neglected for volatiles with high blood:air partition coefficients, i.e. highly water soluble volatiles.
Data are often referred to today as the «new gold», because recent years have shown that data can be the foundation of remarkable entrepreneurial success stories. Working with data is not fundamentally new. What has changed is that nearly unlimited amounts of data can now be collected, stored and analysed across virtually every conceivable process or interface. Among other things, this includes machine data, company-internal processes, and data about customers and the market, which form the basis for learning systems (artificial intelligence). We can assume today that in the future it will no longer be technical feasibility but human imagination that defines the limits of what is possible.
Best known are the many success stories of large enterprises that build their business on data. Established SMEs, by contrast, are still hesitant to work with data and to use them to create value. This brochure addresses the particular situation of SMEs in dealing with data and data science, because for SMEs, too, engaging with the topic of «data science» can be rewarding or even imperative. Data and data science offer great opportunities, but they can also become a competitive threat. Moreover, SMEs should not wait too long, as time is pressing: speed is one of the central competitive factors in the digital age. The IBH-Lab KMUdigital supports SMEs in travelling the challenging path into a digital future faster and more easily.
This brochure therefore focuses in particular on the role of data and data science for SMEs in the Lake Constance region. It summarises selected findings and recommendations gained in a two-year research project conducted together with 16 companies from the Lake Constance region. These findings are intended to support SMEs in using data through data science. The goal is not for SMEs to become a «small Google». Rather, SME-specific solutions and considerations are needed for handling data in a sensible, goal-oriented and resource-efficient way. What can that look like? Which opportunities, challenges and solutions present themselves to SMEs given their particular situation? What must change within the company? How does this path differ from that of large enterprises?
These and further questions are at the centre of the present project report on the project «Data Science für KMU leicht gemacht», in short «Data Science 4 KMU» or «Data4KMU», which was carried out under the umbrella of the IBH-Lab KMUdigital in 2018 and 2019. Data and data science are considered from several interdependent perspectives: strategy and business model, services and processes, leadership, HRM and organisation, organisational culture and holism, as well as technology. These perspectives are taken up in the following chapters.
This brochure would not have been possible without the valuable support of the project's practice partners, the management of the IBH-Lab KMUdigital, and the financial project funding provided by the Internationale Bodenseehochschule (IBH) and Interreg. To all of them we extend our special thanks!
ÖMG Conference 2019
(2019)
On the extension of digital ecosystems for SCM and customs with distributed ledger technologies
(2019)
Global supply chains represent the backbone of the modern manufacturing industry. Planning of global supply chains still represents a major hurdle, mainly because of the high complexity and unforeseen disruptions that have to be mastered for meeting the different logistics windows in a globally distributed production environment. Trust in supply chains is an additional challenge. A major – albeit sometimes overlooked – part of Supply Chain Management (SCM) is the management and integration of customs processes, clearing of tariffs, (re-)billing of customers, and fulfilling other legal requirements related to crossing borders, ranging from environmental standards over goods inspection to general paperwork. With the exception of work offered by the World Customs Organization (WCO), the issue of customs and blockchain is still underrepresented in research and practice. In this paper, we look at innovations that drive current ICT-enabled SCM research and how these can be combined with smart customs management. After a literature review and introduction to the state of the art, we list potential trust-based innovations for SCM and customs in digital business ecosystems. Based upon these innovations, we also describe a requirements analysis of existing distributed ledger technologies (requirements for system layout, system configuration, system governance). A description of the prototype for the Lake Constance region – on which we are currently working – concludes the paper.
In engineering design, optimization methods are frequently used to improve the initial design of a product. However, the selection of an appropriate method is challenging, since many methods exist, especially in the case of simulation-based optimization. This paper proposes a systematic procedure to support this selection process. Building upon quality function deployment, end-user and design use case requirements can be systematically taken into account via a decision matrix. The design and construction of the decision matrix are explained in detail. The proposed procedure is validated on two engineering optimization problems arising within the design of box-type boom cranes. For each problem, the problem statement and the respectively applied optimization methods are explained in detail. The results obtained by optimization validate the use of optimization approaches within the design process. The application of the decision matrix shows the successful incorporation of customer requirements into the algorithm selection.
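The core mechanics of such a decision matrix can be sketched compactly. The requirement weights, method names and ratings below are invented for illustration and do not reproduce the paper's actual criteria or scores.

```python
# Minimal sketch of a QFD-style decision matrix: each candidate
# method is scored as the weighted sum of its ratings against the
# requirements. Weights and ratings here are hypothetical.
def rank_methods(weights, ratings):
    """Return methods sorted by weighted score, best first."""
    scores = {
        method: sum(weights[req] * r for req, r in reqs.items())
        for method, reqs in ratings.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical requirement weights (must cover all rated criteria).
weights = {"accuracy": 0.5, "runtime": 0.3, "ease_of_use": 0.2}

# Hypothetical ratings of two candidate methods on a 1-5 scale.
ratings = {
    "evolution_strategy": {"accuracy": 4, "runtime": 2, "ease_of_use": 4},
    "gradient_descent":   {"accuracy": 3, "runtime": 5, "ease_of_use": 3},
}

ranked = rank_methods(weights, ratings)
```

The weighted-sum form makes the trade-off explicit: changing the requirement weights (e.g. prioritising runtime for a simulation-heavy use case) can change which method the matrix recommends.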
Analysis of the (μ/μI,λ)-CSA-ES with repair by projection applied to a conically constrained problem
(2019)
Stress testing is part of today’s bank risk management and often required by the governing regulatory authority. Performing such a stress test with stress scenarios derived from a distribution, instead of pre-defined expert scenarios, results in a systematic approach in which new severe scenarios can be discovered. The required scenario distribution is obtained from historical time series via a Vector-Autoregressive time series model. The worst-case search, i.e. finding the scenario yielding the most severe situation for the bank, can be stated as an optimization problem. The problem itself is a constrained optimization problem in a high-dimensional search space. The constraints are the box constraints on the scenario variables and the plausibility of a scenario, the latter expressed by an elliptic constraint. As the evaluation of the stress scenarios is performed with a simulation tool, the optimization problem can be seen as a black-box optimization problem. An Evolution Strategy, a well-known optimizer for black-box problems, is applied here. The necessary adaptations to the algorithm are explained and a set of different algorithm design choices is investigated. It is shown that a simple box constraint handling method, i.e. setting variables which violate a box constraint to the respective boundary of the feasible domain, in combination with a repair of implausible scenarios, provides good results.
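The two-step repair described above can be sketched directly. As a simplifying assumption, the elliptic plausibility constraint is written here with a diagonal weight vector, sum(w_i * x_i^2) <= c, rather than a full covariance-based ellipsoid; the clamp-then-scale logic is the same.

```python
# Sketch of the repair described in the abstract, under the stated
# assumption of a diagonal ellipsoid sum(w_i * x_i^2) <= c.
import math

def repair(x, lower, upper, weights, c):
    """Clamp box violations to the boundary, then shrink an
    implausible scenario onto the elliptic constraint surface."""
    # 1) Box repair: set violating variables to the nearest bound.
    x = [min(max(v, lower), upper) for v in x]
    # 2) Plausibility repair: if the scenario lies outside the
    #    ellipsoid, scale it toward the origin until it lies on
    #    the surface. Shrinking cannot re-violate the box, since
    #    the box contains the origin.
    q = sum(w * v * v for w, v in zip(weights, x))
    if q > c:
        s = math.sqrt(c / q)
        x = [s * v for v in x]
    return x

scenario = repair([3.0, -4.0, 0.5], lower=-2.0, upper=2.0,
                  weights=[1.0, 1.0, 1.0], c=4.0)
```

Because the repaired scenario is what the simulation tool evaluates, this keeps every fitness evaluation inside the plausible, admissible region without requiring penalty terms.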
Breath analysis holds great promise for real-time and non-invasive medical diagnosis. Thus, there is a considerable need for easy-to-use and portable analyzers for rapid detection of breath indicators for different diseases in their early stages. Sensor technology meets all of these demands. However, miniaturized breath analyzers require adequate breath sampling methods. In this context, we propose non-contact sampling, namely the collection of breath samples by exhalation from a distance into a miniaturized collector without bringing the mouth into direct contact with the analyzing device. To evaluate this approach, different breathing maneuvers were tested in a real-time regime on a cohort of 23 volunteers using proton transfer reaction mass spectrometry. The breathing maneuvers embraced distinct depths of respiration, exhalation manners, sizes of the mouth opening and different sampling distances. Two inhalation modes (normal, relaxed breathing and deep breathing) and two exhalation manners (via a smaller and a wider lips opening), forming four sampling scenarios, were selected. A sampling distance of approximately 2 cm was found to be a reasonable trade-off between sample dilution and the requirement of no physical contact of the subject with the analyzer. All four scenarios exhibited a comparable measurement reproducibility spread of around 10%. For normal, relaxed inspiration, both dead-space and end-tidal phases of exhalation lasted approximately 1.5 s for both expiration protocols. Deep inhalation prolongs the end-tidal phase to about 3 s in the case of blowing via a small lips opening, and by 50% when the air is exhaled via a wide one. In conclusion, non-contact breath sampling can be considered a promising alternative to the existing breath sampling methods, being relatively close to natural spontaneous breathing.
Adult muscle carnitine palmitoyltransferase (CPT) II deficiency is a rare autosomal recessive disorder of long-chain fatty acid metabolism. It is typically associated with recurrent episodes of exercise-induced rhabdomyolysis and myoglobinuria, in most cases caused by a c.338C > T mutation in the CPT2 gene. Here we present the pedigree of one of the largest family studies of CPT II deficiency caused by the c.338C > T mutation documented so far. The pedigree comprises 24 blood relatives of the index patient, a 32-year-old female with genetically proven CPT II deficiency. In total, the mutation was detected in 20 family members, among them five homozygotes and 15 heterozygotes. In all homozygotes, first symptoms of CPT II deficiency occurred during childhood. Additionally, two already deceased relatives of the index patient were carriers of at least one copy of the genetic variant, revealing a remarkably high prevalence of the c.338C > T mutation within the tested family. Besides the index patient, only one individual had been diagnosed with CPT II deficiency prior to this study, and three cases of CPT II deficiency were newly detected by this family study, pointing to a general underdiagnosis of the disease. Therefore, this study emphasizes the need to raise awareness of CPT II deficiency for correct diagnosis and accurate management of the disease.
Does trading in agricultural derivatives amplify the price fluctuations of agricultural products? In the political debate, this thesis is often cited as a reason for strict regulation of trading in agricultural derivatives. Here I discuss the premises on which various arguments for this thesis rest. The notions of equilibrium and self-reference play a central role. These notions are important in logic and physics, but have surprising consequences in economics.
A multi-recombinative active matrix adaptation evolution strategy for constrained optimization
(2019)
Product ion distributions resulting from the primary reactions of H3O+ with nine D-labeled volatile organic compounds and the subsequent sequential reactions with H2O have been determined using a Proton Transfer Reaction Time of Flight Mass Spectrometer (PTR-TOF 8000 (IONICON Analytik GmbH)) at various reduced electric field (E/N) values ranging from 80 up to 150 Td and for two different absolute humidity levels of the air sample, < 0.1% and 5%. The specific D-labeled compounds used in this study are acetone-d6, toluene-d8, benzene-d6, ethanol-d (C2H5OD), ethanol-d2 (CH3CD2OH), ethanol-d6, 2-propanol-d8, 2-propanol-d3 (CD3CH(OH)CH3), and isoprene-d5 (CH2CHC(CD2)CD3). With the exception of the two 2-propanol compounds, non-dissociative proton transfer is the dominant primary reaction pathway. For 2-propanol-d8 and 2-propanol-d3, the major primary reaction channel involved is dissociative proton transfer. However, unlike their undeuterated counterparts, the primary product ions undergo subsequent deuterium/hydrogen isotope exchange reactions with the ever-present water in the drift tube, the extent of which of course depends on the humidity within that tube. This exchange leads to the generation of various isotopologue product ions, the product ion branching percentages of which are also dependent on the humidity in the drift tube. This results in complex mass spectra, and the distribution of product ions leads to issues of reduced sensitivity and accuracy. However, the effect of D/H exchange varies considerably between the compounds under study. In the case of acetone-d6 it is very weak (<1%), because the exchange process is not facile when the deuterium is in the methyl functional group. In comparison, the H3O+/benzene-d6 (C6D6) reaction and sequential reactions with water result in the production of the isotopologue ions C6Dn(H7-n)+ (where n = 0–6). Changing the value of E/N and/or the humidity in the drift tube considerably affects the amount of isotope exchange reactions and hence the resulting sequential product ion distributions. An important conclusion of the findings from this work is that care must be taken in the choice of an exogenous deuterated compound for use in breath pharmacokinetic studies using proton transfer reaction mass spectrometry; otherwise the resulting D/H exchange processes impose interpretative problems.
Breath analysis offers a non-invasive and rapid diagnostic method for detecting various volatile organic compounds that could be indicators for different diseases, particularly metabolic disorders including type 2 diabetes mellitus. The development of type 2 diabetes mellitus is closely linked to metabolic dysfunction of adipose tissue and adipocytes. However, the VOC profile of human adipocytes has not yet been investigated. Gas chromatography with mass spectrometric detection and head-space needle trap extraction (two-bed Carbopack X/Carboxen 1000 needle traps) were applied to profile VOCs produced and metabolised by human Simpson Golabi Behmel Syndrome adipocytes. In total, sixteen compounds were identified to be related to the metabolism of the cells. Four sulphur compounds (carbon disulphide, dimethyl sulphide, ethyl methyl sulphide and dimethyl disulphide), three heterocyclic compounds (2-ethylfuran, 2-methyl-5-(methyl-thio)-furan, and 2-pentylfuran), two ketones (acetone and 2-pentanone), two hydrocarbons (isoprene and n-heptane) and one ester (ethyl acetate) were produced, and four aldehydes (2-methyl-propanal, butanal, pentanal and hexanal) were found to be consumed by the cells of interest. This study presents the first profile of VOCs formed by human adipocytes, which may reflect the activity of the adipose tissue enzymes and provide evidence of their active role in metabolic regulation. Our data also suggest that a previously reported increase of isoprene and sulphur compounds in diabetic patients may be explained by their production by adipocytes. Moreover, the unique features of this profile, including a high emission of dimethyl sulphide and the production of furan-containing VOCs, increase our knowledge about metabolism in adipose tissue and provide diagnostic potential for future applications.
A covariance matrix self-adaptation evolution strategy for optimization under linear constraints
(2018)