Document Type
- Article (67)
- Conference Proceeding (22)
- Report (14)
- Part of a Book (4)
- Master's Thesis (3)
- Book (2)
- Doctoral Thesis (2)
- Other (2)
- Working Paper (2)
Institute
- Forschungszentrum Business Informatics (27)
- Forschungszentrum Mikrotechnik (25)
- Forschungszentrum Energie (19)
- Soziales & Gesundheit (15)
- Technik | Engineering & Technology (13)
- Department of Computer Science (dissolved at the end of 2021; integrated into the parent organisational unit Technik) (12)
- Forschungszentrum Human Centred Technologies (11)
- Forschungsgruppe Empirische Sozialwissenschaften (9)
- Forschungszentrum Digital Factory Vorarlberg (5)
- Wirtschaft (5)
Language
- English (84)
- German (33)
- Multiple languages (1)
Has Fulltext
- yes (118)
Is part of the Bibliography
- yes (118)
Keywords
- Global optimization (3)
- Kultur (3)
- Partizipation (3)
- Peripheral arterial disease (3)
- Pflege (3)
- Rastrigin function (3)
- Y-branch splitter (3)
- Bubble column humidifier (2)
- Cloud manufacturing (2)
- Demand response (2)
In this paper, we propose and simulate a new type of three-dimensional (3D) optical splitter based on multimode interference (MMI) for a wavelength of 1550 nm. The splitter is designed on a square base of 20 × 20 µm² using the IP-Dip polymer, a standard material for 3D laser lithography. We present the optical field distribution in the proposed MMI splitter and the possibility of integrating it with an optical fiber. The design is aimed at a feasible fabrication process using 3D laser lithography for forthcoming experiments.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R² = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached, at an average steam production of 35.1 t/h. On average, using a dynamic dataset for training lowers the MAE by 32% compared to a static training dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy, which allows short-term problems to be detected.
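As an illustration of the benchmarking setup described above, the sketch below compares two regressors on a basic and an expert feature set and reports R² and MAE. The file name, column names, and date split are hypothetical placeholders, not the plant's actual data layout.

```python
# Sketch: benchmarking regression models for steam-production prediction.
# File name and column names are hypothetical; the paper's data layout is not public.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, r2_score

df = pd.read_csv("power_plant_signals.csv", parse_dates=["timestamp"])  # hypothetical file

# "Basic" feature set (fuel inputs only) vs. "expert" feature set -- illustrative names.
basic_features = ["fuel_mass_flow", "fuel_heating_value"]
expert_features = basic_features + ["combustion_air_flow", "feedwater_temperature"]
target = "live_steam_production"

# Static training set: one fixed historical window; test on the most recent data.
train = df[df["timestamp"] < "2021-01-01"]
test = df[df["timestamp"] >= "2021-01-01"]

for name, model in [("ridge", Ridge()), ("gbr", GradientBoostingRegressor())]:
    for label, feats in [("basic", basic_features), ("expert", expert_features)]:
        model.fit(train[feats], train[target])
        pred = model.predict(test[feats])
        print(f"{name}/{label}: R2={r2_score(test[target], pred):.2f}, "
              f"MAE={mean_absolute_error(test[target], pred):.2f} t/h")
```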
The Digital Factory Vorarlberg is the youngest research center of Vorarlberg University of Applied Sciences. In the lab of the research center, a research and learning factory has been established for educating students and employees of industrial partners. Showcases and best-practice scenarios for various topics of digitalization in the manufacturing industry are demonstrated. In addition, novel methods and technologies for digital production, cloud-based manufacturing, data analytics, IT and OT security, and digital twins are being developed. The factory comprises only a minimal core of logistics and fabrication processes to guarantee manageability within an academic setup. As a product, fidget spinners are fabricated. A webshop allows customers to individually design their products and place orders directly in the factory. A centralized SCADA system is the core data hub of the factory. Various data analytics tools and methods and a novel database for IoT applications are connected to the SCADA system. As an alternative to on-premises manufacturing, orders can be pushed into a cloud-based manufacturing platform that has been developed at the Digital Factory. A broker system allows fabrication in distributed facilities and offers various optimization services. Concepts such as outsourcing product configuration to customers or new types of engineering services in cloud-based manufacturing can be explored and demonstrated. In this paper, we present the basic concept of the Digital Factory Vorarlberg as well as some of the newly developed topics.
In engineering design, optimization methods are frequently used to improve the initial design of a product. However, selecting an appropriate method is challenging since many methods exist, especially in the case of simulation-based optimization. This paper proposes a systematic procedure to support this selection process. Building upon quality function deployment, end-user and design use case requirements can be systematically taken into account via a decision matrix. The design and construction of the decision matrix are explained in detail. The proposed procedure is validated on two engineering optimization problems arising in the design of box-type boom cranes. For each problem, the problem statement and the applied optimization methods are explained in detail. The results obtained by optimization validate the use of optimization approaches within the design process. The application of the decision matrix shows the successful incorporation of customer requirements into the algorithm selection.
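A minimal sketch of how such a QFD-style decision matrix might be evaluated is shown below; the requirements, weights, candidate methods, and scores are illustrative assumptions, not the values used in the paper.

```python
# Sketch of a QFD-style decision matrix for choosing an optimization method.
# Requirements, weights, and scores are illustrative placeholders.
import numpy as np

requirements = ["handles expensive simulations", "no gradients available",
                "mixed-integer variables", "easy to parameterize"]
weights = np.array([0.4, 0.3, 0.2, 0.1])          # relative importance from the use case

methods = ["Bayesian optimization", "Genetic algorithm", "Nelder-Mead"]
# scores[i, j]: how well method j satisfies requirement i (0 = poor, 9 = excellent)
scores = np.array([[9, 3, 5],
                   [9, 9, 9],
                   [3, 9, 1],
                   [5, 7, 9]])

totals = weights @ scores                          # weighted-sum rating per method
for m, t in zip(methods, totals):
    print(f"{m}: {t:.2f}")
print("Suggested method:", methods[int(np.argmax(totals))])
```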
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. To create a plan that is robust to uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive. The planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
Traditional power grids are mainly based on centralized power generation and subsequent distribution. The increasing penetration of distributed renewable energy sources and the growing number of electrical loads are creating difficulties in balancing supply and demand and threaten the secure and efficient operation of power grids. At the same time, households hold an increasing amount of flexibility, which can be exploited by demand-side management to decrease customer cost and support grid operation. Compared to the collection of individual flexibilities, aggregation reduces optimization complexity, protects households’ privacy, and lowers the communication effort. In mathematical terms, each flexibility is modeled by a set of power profiles, and the aggregated flexibility is modeled by the Minkowski sum of the individual flexibilities. As the exact Minkowski sum calculation is generally computationally prohibitive, various approximations can be found in the literature. The main contribution of this paper is a comparative evaluation of several approximation algorithms in terms of novel quality criteria, computational complexity, and communication effort using realistic data. Furthermore, we investigate the dependence of selected comparison criteria on the time horizon length and on the number of households. Our results indicate that none of the algorithms performs satisfactorily in all categories. Hence, we provide guidelines for an application-dependent algorithm choice. Moreover, we demonstrate a major drawback of some inner approximations, namely that they may lead to situations in which not using the flexibility is impossible, which may be suboptimal in certain situations.
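To make the aggregation idea concrete, the sketch below shows the simplest case, in which each household's flexibility is approximated by independent per-time-step power bounds (a box); for boxes, the Minkowski sum is exact and reduces to summing the bounds. Real flexibility sets include inter-temporal energy constraints, which is where the approximation algorithms compared in the paper come into play; the numbers here are purely illustrative.

```python
# Sketch: aggregating per-household flexibility approximated as power bounds per time step.
# For axis-aligned boxes the Minkowski sum is exact: the bounds simply add up.
# General flexibility sets (with energy or ramping constraints) require the
# approximations discussed in the paper; this is only the simplest illustrative case.
import numpy as np

T = 4  # time steps
# Lower/upper power bounds in kW for three households (illustrative numbers).
p_min = np.array([[0.0, 0.0, 0.5, 0.0],
                  [0.2, 0.2, 0.2, 0.2],
                  [0.0, 1.0, 1.0, 0.0]])
p_max = np.array([[2.0, 2.0, 2.5, 2.0],
                  [1.2, 1.2, 1.2, 1.2],
                  [3.0, 3.0, 3.0, 1.0]])

# Minkowski sum of the boxes: component-wise sum of the bounds.
agg_min = p_min.sum(axis=0)
agg_max = p_max.sum(axis=0)
print("aggregated lower bounds [kW]:", agg_min)
print("aggregated upper bounds [kW]:", agg_max)
```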
Akademisierung der Pflege (Academisation of Nursing)
(2023)
The transfer of secondary nursing education into the tertiary education sector
Background:
Anglo-American and Northern European countries are considered pioneers in the development of tertiary nursing education programmes. By comparison, the shift of nursing education to the tertiary level began only with a delay in the German-speaking countries. In Austria, as in Switzerland and Germany, professionally qualifying nursing degrees were traditionally not anchored in higher education; instead, schools of health and nursing care provided the professionally qualifying nursing education at the secondary level. With the amendment of the Gesundheits- und Krankenpflegegesetz (Health and Nursing Care Act) in 2016, Austria decreed, as part of a comprehensive education reform and with a transition period until 2024, the complete transfer of secondary nursing education (advanced-level nursing service) into the tertiary higher education sector. In this transformation process, not only do different institutions, organisations and educational cultures meet, but also various system partners with diverging interests, motives and expectations, all of which contribute to the dynamics and complexity of the educational transfer.
Aim:
Against the theoretical background of the sociology of professions, this dissertation focuses on the empirical phenomenon of the transfer of secondary nursing education (advanced-level nursing service) into the tertiary education sector, using the development in Vorarlberg (Austria) as an example. Andrew Abbott's theory of professions serves as the framework for theoretically embedding and interpreting the empirical case, allowing the educational transformation to be understood as a complex and dynamic process shaped by the differing perspectives, interests and claims of those system partners who are responsible for the transformation process and closely involved in implementing the educational transfer. By applying Abbott's theoretical approach to the case of academising nursing education, this study contributes to the literature on the professionalisation of nursing.
Methods:
The exploration is based on a qualitative single case study spanning five years, which uses multiple data sources (data triangulation) to bring together different data, information and perspectives from individual and group interviews, documents and a research diary. The use of different data sources serves the strategy of gaining a deep and well-founded understanding of the research phenomenon under investigation.
Results:
As a contribution to the existing literature, this study not only describes the course of the process with its milestones, characteristics and events, but also shows that the differing interests, claims and backgrounds of the responsible system partners substantially influence the transformation process and its direction. Diverging expectations, motives and claims not only shape the course and direction of the transfer, but also serve as an explanatory approach for how the system partners involved act, account for their decisions and actions in the transformation process, and thus contribute to the dynamics and complexity of the educational transfer. Furthermore, in view of the Austrian educational context, this study shows that the transformation process of introducing the bachelor's programme can be analysed not only in isolation but also as an embedded part of an overarching education reform, since the shift of nursing education to the tertiary level triggers further mechanisms and reactions that affect the educational and professional sector of nursing as a whole and in the long term.
Conclusion:
The shift of nursing education to the tertiary level constitutes a long-term transformation process that is not completed when the bachelor graduates enter the profession or when a legally defined transition period ends. In Abbott's sense, the educational transfer should be understood as a complex and dynamic transformation process that takes place on different levels of action and involves differing interests, motives and claims. Academising nursing education therefore requires a long-term horizon for implementation and process support in order to realise the system change at several levels of the educational and professional sector and to achieve a change of awareness among the stakeholder groups involved. This also points to potential for further research.
Keywords:
academisation, academisation process, transformation, transfer, process, nursing, nursing education, nursing degree programme, professionalisation, sociology of professions, theories of professions, Andrew Abbott, single case study, data triangulation
Bubble column humidifiers (BCHs) are frequently used for the humidification of air in various water treatment applications. A potential but not yet thoroughly investigated application of such devices is the treatment of oily wastewater. To evaluate this application, the accumulation of an oil-water emulsion using a BCH is analyzed experimentally. The amount of evaporating water vapor can be evaluated by measuring the humidity ratio of the outlet air. However, humidity measurements are difficult in near-saturated conditions, as the formation of liquid droplets on the sensor impairs the measurement accuracy. We use a heating section after the humidifier so that no liquid droplets form on the sensor, which enables a more accurate humidity measurement. Two batch measurement runs are conducted with (1) tap water and (2) an oil-water emulsion as the respective liquid phase. The humidity measurement in high-humidity conditions is highly accurate, with an error margin below 3%, and can be used to predict the oil concentration of the remaining liquid during operation. The measured humidity ratio corresponds to the removed amount of water vapor for both tap water and the accumulated oil-water emulsion. Our measurements show that the residual water content in the oil-water emulsion is below 4%.
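For context, the humidity ratio can be recovered from temperature and relative humidity with standard psychrometric relations. The sketch below uses a Magnus-type saturation pressure formula and illustrative temperatures; it is not taken from the paper, but shows why heating the outlet air keeps the sensor below saturation.

```python
# Sketch: humidity ratio of the (heated) outlet air from standard psychrometric relations.
# Magnus-type saturation pressure with textbook constants; temperatures are illustrative.
import math

def saturation_pressure_pa(t_celsius: float) -> float:
    """Saturation vapour pressure of water over liquid (Magnus formula), in Pa."""
    return 611.2 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def humidity_ratio(t_celsius: float, rel_humidity: float, p_total_pa: float = 101325.0) -> float:
    """Humidity ratio w in kg of water vapour per kg of dry air."""
    p_v = rel_humidity * saturation_pressure_pa(t_celsius)
    return 0.622 * p_v / (p_total_pa - p_v)

# Heating after the humidifier lowers relative humidity at constant humidity ratio,
# so the sensor stays below saturation. Example: air saturated at 40 °C, heated to 55 °C.
w = humidity_ratio(40.0, 1.0)
rh_after_heating = w * 101325.0 / ((0.622 + w) * saturation_pressure_pa(55.0))
print(f"humidity ratio: {w:.4f} kg/kg, RH after heating: {rh_after_heating:.2%}")
```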
Vast amounts of oily wastewater are byproducts of the petrochemical and shipping industries and are, to this day, frequently discharged into water bodies either untreated or after insufficient treatment. To alleviate the resulting pollution, water treatment processes are in great demand. Bubble column humidifiers (BCHs) as part of humidification–dehumidification systems are predestined for such a task, since they are insensitive to different feed liquids, simple in design, and have low maintenance requirements. While humidification in a bubble column has been investigated extensively for desalination, a systematic investigation of oily wastewater treatment is missing in the literature. We fill this gap by analyzing the treatment of an oil–water emulsion experimentally to derive recommendations for the future design and operation of BCHs. Our humidity measurements indicate that the air stream is always saturated after humidification for a liquid height of only 10 cm. A residual water mass fraction of 3.5 wt% is measured after a batch run of six hours. Furthermore, continuous measurements show that an increase in oil mass fraction leads to a decrease in system productivity, especially at high oil mass fractions. This decrease is caused by the heterogeneity of the liquid temperature profile. A lower liquid height mitigates this heterogeneity, thereby decreasing the heat demand and improving the overall efficiency. The oil content of the produced condensate is below 15 ppm, allowing discharge into various water bodies. The results of our systematic investigation prove the suitability of BCHs for oily wastewater treatment and indicate a strong future potential for their use.
Activating heat pump flexibilities is a viable way to support grid balancing via demand-side management measures and to fulfill the need for flexibility options. Aggregators, as the interface between prosumers, distribution system operators, and balance responsible parties, face the challenge of transforming prosumer information into aggregated available flexibility that can be traded, while respecting data privacy and technical restrictions. However, the literature lacks a generic, applicable, and widely accepted flexibility estimation method for heat pumps that copes with reduced sensor and system information and captures system- and demand-dependent behaviour. In this paper, we adapt and extend a method from the literature by incorporating domain knowledge to overcome reduced sensor and system information. We apply data from five real-world heat pump systems, distinguish operation modes, estimate the power and energy flexibility of each individual heat pump system, prove the transferability of the method, and aggregate the available flexibilities to showcase a small heat pump (HP) pool as a proof of concept.
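The sketch below illustrates one generic way to express the power and energy flexibility of a single heat pump coupled to a thermal buffer; the parameters and the simple buffer model are assumptions for illustration and do not reproduce the adapted estimation method from the paper.

```python
# Sketch: a generic (not the paper's) estimate of heat pump flexibility from a thermal buffer.
# All parameters are illustrative assumptions; the paper works with reduced sensor information.
from dataclasses import dataclass

@dataclass
class HeatPumpState:
    p_el_rated: float        # rated electrical power [kW]
    p_el_current: float      # current electrical power [kW]
    storage_capacity: float  # usable thermal storage capacity [kWh_th]
    storage_level: float     # current thermal storage content [kWh_th]
    cop: float               # coefficient of performance [-]

def power_flexibility(hp: HeatPumpState) -> tuple[float, float]:
    """Electrical power that can be shifted down (-) or up (+) right now, in kW."""
    up = hp.p_el_rated - hp.p_el_current   # room to increase consumption
    down = -hp.p_el_current                # room to decrease (switch off)
    return down, up

def energy_flexibility(hp: HeatPumpState) -> tuple[float, float]:
    """Electrical energy that can be drawn from (-) or stored into (+) the buffer, in kWh_el."""
    charge = (hp.storage_capacity - hp.storage_level) / hp.cop  # heat up the buffer now
    discharge = hp.storage_level / hp.cop                       # cover demand from the buffer
    return -discharge, charge

hp = HeatPumpState(p_el_rated=3.0, p_el_current=1.0,
                   storage_capacity=10.0, storage_level=6.0, cop=3.5)
print("power flexibility [kW]:", power_flexibility(hp))
print("energy flexibility [kWh_el]:", energy_flexibility(hp))
```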
Work package 3: Exploiting the innovation potential of smart technologies - FH Vorarlberg
(2022)
With digitalisation and the increased connectivity between manufacturing systems emerging in this context, manufacturing is shifting towards decentralised, distributed concepts. Still, manufacturing scenarios require manual input or augmentation of data at system boundaries. Especially in distributed manufacturing environments, such as Cloud Manufacturing (CMfg) systems, constant changes to the available manufacturing resources and products pose challenges for establishing connections between them. We propose a feature-oriented representation of concepts, especially from the manufacturing domain, which serves as the basis for (semi-)automatically linking, e.g., manufacturing resources and products. This linking methodology, as well as knowledge inferred using it, is then used to support distributed manufacturing, especially in CMfg environments, and to enhance product development. The concepts and methodologies are to be evaluated in a real-world learning factory.
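A minimal sketch of feature-oriented matching between product requirements and manufacturing resources is given below; the feature names and capability sets are hypothetical and only illustrate the linking idea, not the proposed representation.

```python
# Sketch: feature-oriented matching of product requirements to manufacturing resources.
# Feature names and capability sets are illustrative assumptions, not the proposed ontology.
required_features = {"milling", "aluminium", "tolerance_0.05mm"}

resources = {
    "machine_A": {"milling", "steel", "aluminium", "tolerance_0.05mm"},
    "machine_B": {"turning", "aluminium", "tolerance_0.1mm"},
    "machine_C": {"milling", "aluminium", "tolerance_0.1mm"},
}

# A resource is a candidate if it offers every required feature;
# otherwise report what is missing so the gap can be closed manually.
for name, capabilities in resources.items():
    missing = required_features - capabilities
    if not missing:
        print(f"{name}: full match")
    else:
        print(f"{name}: missing {sorted(missing)}")
```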
Demand-side management (DSM) is regarded as one possible approach to compensating for the effects of the expansion of fluctuating renewables in the power grid. If many distributed energy systems are to be addressed, centralised approaches place high demands on the communication infrastructure. As an alternative, autonomous demand-side management (ADSM) with incentive-based optimisation directly on the consumer device is widely considered; in this case, the incentive function can be transmitted via unidirectional communication.
In recent years, the Forschungszentrum Energie at Fachhochschule Vorarlberg has developed algorithms and prototypes for applying ADSM to a wide range of distributed energy storage systems in the electrical grid. Both thermal energy storage (e.g., domestic hot water tanks) and electrochemical storage (e.g., battery storage systems or electric vehicles) are considered. In addition, the effects of such systems on the electrical distribution grid are investigated. This article gives an overview of the methods developed and the results obtained in this field of research, with the aim of creating a broad understanding of the opportunities and limits of ADSM.
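As an illustration of incentive-based ADSM on a single device, the sketch below schedules a hot-water storage heater into the hours with the lowest incentive values; the incentive signal and device parameters are assumed for illustration and do not reproduce the algorithms developed at the Forschungszentrum Energie.

```python
# Sketch: incentive-based local scheduling of a hot-water storage heater (ADSM).
# Incentive signal and device parameters are illustrative assumptions only.
import numpy as np

incentive = np.array([0.30, 0.28, 0.25, 0.10, 0.08, 0.12,   # price-like signal per hour
                      0.20, 0.35, 0.40, 0.38, 0.30, 0.22])
energy_needed_kwh = 6.0      # energy the storage must take up within the horizon
p_heater_kw = 2.0            # electrical power of the heating element

hours_needed = int(np.ceil(energy_needed_kwh / p_heater_kw))
# Greedy rule: heat during the hours with the lowest incentive values.
cheapest_hours = np.argsort(incentive)[:hours_needed]
schedule = np.zeros_like(incentive)
schedule[cheapest_hours] = p_heater_kw

print("heating hours:", sorted(cheapest_hours.tolist()))
print("cost-like objective:", float(np.sum(schedule * incentive)))
```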
Pooled data from published reports on infants with clinically diagnosed vitamin B12 (B12) deficiency were analyzed with the purpose of describing the presentation, diagnostic approaches, and risk factors for the condition to inform prevention strategies. An electronic (PubMed database) and manual literature search following the PRISMA approach was conducted (preregistration with the Open Science Framework, accessed on 15 February 2023). In total, 102 publications (292 cases) were included; data were described and analyzed using correlation analyses, chi-square tests, ANOVAs, and regression analyses. The mean age at first symptoms (anemia, various neurological symptoms) was four months; the mean time to diagnosis was 2.6 months. Maternal B12 at diagnosis, exclusive breastfeeding, and a maternal diet low in B12 predicted infant B12, methylmalonic acid, and total homocysteine. Infant B12 deficiency is still not easily diagnosed. Methylmalonic acid and total homocysteine are useful diagnostic parameters in addition to B12 levels. Since maternal B12 status predicts infant B12 status, it would probably be advantageous to target women in early pregnancy or even preconceptionally to prevent infant B12 deficiency, rather than to rely on newborn screening that often does not reliably identify high-risk children.
The production of liquid-gas dispersions places high demands on process technology and requires knowledge of the bubble formation mechanisms as well as the phase parameters of the media combinations used. To determine the bubble sizes introduced into a flow without knowing the phase parameters, different process parameters are investigated, and their quality and applicability are evaluated. The results obtained make it possible to simplify lengthy design processes for dispersion processes in manufacturing plants and to ensure the quality of the manufactured products by reducing waste.
Business analytics is one of the key future topics in management accounting (controlling). In controlling education, however, analytics has so far played only a minor role. This article describes an innovative teaching project that enables students in the master's programme Accounting, Controlling & Finance at FH Vorarlberg to independently answer controlling-related questions in the context of business analytics. At the same time, the students learn how to work with the open-source software R.
The increasing digitalisation of daily routines confronts people with frequent privacy decisions. However, obscure data processing often leads to tedious decision-making and results in unreflective choices that unduly compromise privacy. Serious Games could be applied to encourage teenagers and young adults to make more thoughtful privacy decisions. Creating a Serious Game (SG) that promotes privacy awareness while maintaining engaging gameplay requires, however, a carefully balanced game concept. This study explores the benefits of an online role-playing boardgame as a co-design activity for creating SGs about privacy. In a between-subjects trial, student groups and educator/researcher groups took on the roles of player, teacher, researcher, and designer to co-design a balanced privacy SG concept. Using predefined design proposal cards or creating their own, students and educators played the online boardgame during a video conference session to generate game ideas, resolve potential conflicts, and balance the different SG aspects. The comparative results of the present study indicate that students and educators alike perceive role-playing as supportive when ideating and balancing SG concepts and are satisfied with their playfully co-designed game concepts. Implications for supporting SG design with role-playing in remote collaboration scenarios are synthesised in conclusion.