In this paper, a 256-channel, 10-GHz arrayed waveguide grating (AWG) demultiplexer for ultra-dense wavelength division multiplexing was designed using an in-house developed tool called AWG-Parameters. The AWG demultiplexer was designed for a central wavelength of 1550 nm and the structure was simulated in the PHASAR tool from Optiwave. Two different AWG designs were developed and the influence of the design parameters on the AWG performance was studied.
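The channel layout of such a demultiplexer follows from the frequency grid alone. As a rough illustration (not the AWG-Parameters tool from the paper, and with invented variable names), the wavelengths of a 256-channel, 10 GHz grid centred at 1550 nm can be computed like this:

```python
# Sketch: wavelength grid of a 256-channel, 10 GHz AWG demultiplexer
# centred at 1550 nm. Illustrative only; not the AWG-Parameters tool.
C = 299_792_458.0  # speed of light in vacuum, m/s

def channel_wavelengths_nm(center_nm, spacing_hz, n_channels):
    """Return the channel wavelengths (nm) for a uniform frequency grid."""
    f_center = C / (center_nm * 1e-9)  # central optical frequency, Hz
    # symmetric channel offsets around the centre frequency
    offsets = [i - (n_channels - 1) / 2 for i in range(n_channels)]
    return [C / (f_center + k * spacing_hz) * 1e9 for k in offsets]

grid = channel_wavelengths_nm(1550.0, 10e9, 256)
span_nm = max(grid) - min(grid)  # total occupied wavelength span, ~20 nm
```

The 255 frequency gaps of 10 GHz translate to roughly 0.08 nm per channel near 1550 nm, so the full comb occupies only about 20 nm.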
This paper presents concepts of optical splitting with three-dimensional (3D) splitters based on the multimode interference (MMI) principle. It focuses on the design, fabrication and characterization of a 3D MMI splitter with formed output waveguides, based on IP-Dip polymer, for direct application on an optical fiber. The MMI optical splitter was simulated and fabricated using a direct laser writing process. The output characteristics were measured with a high-resolution near-field scanning optical microscope (NSOM) and compared with those of a 3D MMI splitter without output waveguides.
In this paper, we document optical splitters based on the Y-branch and on the MMI splitting principle. The 1×4 Y-branch splitter was prepared in 3D geometry entirely from polymer, approaching single-mode transmission at 1550 nm. We also prepared a new concept of a 1×4 MMI optical splitter. Their optical properties and the character of the output optical field were measured with a near-field scanning optical microscope. The splitting properties and optical outputs of both splitters are very promising and increase the attractiveness of the presented 3D technology and polymers.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R² = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached at an average steam production of 35.1 t/h. On average, using a dynamic dataset for training lowers the MAE by 32% compared to a static dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy, which allows short-term problems to be detected.
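The two benchmark metrics quoted above are standard and easy to reproduce. A minimal sketch on hypothetical steam-production values (not the plant data from the study):

```python
# Mean absolute error and coefficient of determination, the two metrics
# used in the benchmark above, on invented example data in t/h.

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination R^2."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

y_true = [34.0, 36.5, 35.2, 33.8, 36.0]   # measured steam production
y_pred = [34.5, 36.0, 35.0, 34.2, 35.6]   # model prediction
```

R² compares the model's squared error against a constant mean predictor, which is why a value of 0.95 indicates that the model explains most of the variance in steam production.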
A modified matrix adaptation evolution strategy with restarts for constrained real-world problems
(2020)
In combination with successful constraint handling techniques, a Matrix Adaptation Evolution Strategy (MA-ES) variant (the εMAg-ES) turned out to be a competitive algorithm on the constrained optimization problems proposed for the CEC 2018 competition on constrained single objective real-parameter optimization. A subsequent analysis points to additional potential in terms of robustness and solution quality. The consideration of a restart scheme and adjustments in the constraint handling techniques put this into effect and simplify the configuration. The resulting BP-εMAg-ES algorithm is applied to the constrained problems proposed for the IEEE CEC 2020 competition on Real-World Single-Objective Constrained optimization. The novel MA-ES variant realizes improvements over the original εMAg-ES in terms of feasibility and effectiveness on many of the real-world benchmarks. The BP-εMAg-ES realizes a feasibility rate of 100% on 44 out of 57 real-world problems and improves the best-known solution in 5 cases.
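The restart scheme is the key addition in the BP-εMAg-ES. The idea can be sketched far more simply than the actual algorithm, e.g. with a (1+1)-ES on an unconstrained toy function that restarts from a fresh random point when progress stagnates (all names and parameters here are illustrative, not from the paper):

```python
# Minimal sketch of an evolution strategy with restarts: a (1+1)-ES on
# the sphere function that restarts when no improvement occurs for a
# fixed number of evaluations. Far simpler than the BP-epsilonMAg-ES.
import random

def sphere(x):
    return sum(v * v for v in x)

def es_with_restarts(dim=5, budget=20_000, stall_limit=500, seed=1):
    rng = random.Random(seed)
    best_f, evals = float("inf"), 0
    while evals < budget:                      # restart loop
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        fx, sigma, stall = sphere(x), 1.0, 0
        evals += 1
        while evals < budget and stall < stall_limit:
            y = [v + sigma * rng.gauss(0, 1) for v in x]
            fy = sphere(y)
            evals += 1
            if fy < fx:                        # accept the improvement
                x, fx, stall = y, fy, 0
                sigma *= 1.5                   # step-size adaptation
            else:
                stall += 1
                sigma *= 0.98
        best_f = min(best_f, fx)               # keep best run's result
    return best_f
```

The full algorithm additionally adapts a covariance-like transformation matrix and handles constraints via ε-level comparisons; the sketch only conveys the restart-on-stagnation mechanism.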
Active demand side management with domestic hot water heaters using binary integer programming
(2013)
Adaptive indirect field-oriented control of an induction machine in the armature control range
(2012)
With Cloud Computing and multi-core CPUs, parallel computing resources are becoming more affordable and commonly available. Parallel programming should be just as easily accessible for everyone. Unfortunately, existing frameworks and systems are powerful but often very complex to use for anyone who lacks knowledge of the underlying concepts. This paper introduces a software framework and execution environment whose objective is to provide a system that is easy to use for everyone who could benefit from parallel computing. Some real-world examples are presented with an explanation of all the steps that are necessary for computing in a parallel and distributed manner.
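The kind of accessibility argued for above is what a one-line parallel map provides. A minimal sketch using only Python's standard library (the workload function is invented for illustration):

```python
# Sketch of "easily accessible" parallelism: map an embarrassingly
# parallel workload over a pool of workers with a single call.
from concurrent.futures import ThreadPoolExecutor

def simulate(task_id):
    """Stand-in for an independent unit of work (e.g. one scenario run)."""
    return task_id * task_id

with ThreadPoolExecutor(max_workers=4) as pool:
    # Executor.map preserves input order in its results
    results = list(pool.map(simulate, range(8)))
```

A framework like the one the paper describes hides the remaining complexity of distributing such tasks across machines rather than local threads or processes.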
Bubble column humidifiers (BCHs) are frequently used for the humidification of air in various water treatment applications. A potential but not yet profoundly investigated application of such devices is the treatment of oily wastewater. To evaluate this application, the accumulation of an oil-water emulsion using a BCH is experimentally analyzed. The amount of evaporating water vapor can be evaluated by measuring the humidity ratio of the outlet air. However, humidity measurements are difficult in close-to-saturated conditions, as the formation of liquid droplets on the sensor impacts the measurement accuracy. We use a heating section after the humidifier, such that no liquid droplets form on the sensor. This enables a more accurate humidity measurement. Two batch measurement runs are conducted with (1) tap water and (2) an oil-water emulsion as the respective liquid phase. The humidity measurement in high-humidity conditions is highly accurate, with an error margin below 3%, and can be used to predict the oil concentration of the remaining liquid during operation. The measured humidity ratio corresponds with the removed amount of water vapor for both tap water and the accumulation of an oil-water emulsion. Our measurements show that the residual water content in the oil-water emulsion is below 4%.
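The humidity ratio measured above is typically derived from air temperature and relative humidity. A small sketch using the textbook Magnus approximation (constants are standard values, not taken from the paper):

```python
# Sketch: humidity ratio of moist air from temperature and relative
# humidity, using the Magnus approximation for saturation pressure.
import math

def humidity_ratio(t_celsius, rel_humidity, p_total=101_325.0):
    """Humidity ratio w in kg water vapour per kg dry air."""
    # Magnus formula: saturation vapour pressure in Pa
    p_sat = 610.94 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))
    p_v = rel_humidity * p_sat           # partial pressure of vapour
    # 0.622 is the molar-mass ratio of water vapour to dry air
    return 0.622 * p_v / (p_total - p_v)
```

The strong growth of the saturation pressure with temperature is why heating the sampled air moves the sensor away from saturation without changing the humidity ratio itself.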
In contrast to fossil energy sources, the supply by renewable energy sources like wind and photovoltaics cannot be controlled. Therefore, flexibilities on the demand side of the electric power grid, like electro-chemical energy storage systems, are used increasingly to match electric supply and demand at all times. To control those flexibilities, we consider two algorithms that both lead to linear programming problems. These are solved autonomously on the demand side, i.e., by household computers. In the classic approach, an energy price signal is sent by the electric utility to the households, which, in turn, optimize the cost of consumption within their constraints. Instead of an energy price signal, we claim that an appropriate power signal that is tracked in L1-norm as closely as possible by the household has favorable characteristics. We argue that an interior point of the household's feasibility region is never an optimal price-based point but can result in an L1-norm optimal point. Thus, price signals cannot parametrize the complete feasibility region, which may not lead to an optimal allocation of consumption. We compare the price and power tracking algorithms over a year on the basis of one-day optimizations regarding different information settings and using a large data set of daily household load profiles. The computational task constitutes an embarrassingly parallel problem. To this end, the performance of the two parallel computation frameworks DEF [1] and Ray [2] is investigated. The Ray framework is used to run the Python applications locally on several cores. With the DEF framework we execute our Python routines in parallel in a cloud. All in all, the results provide an understanding of when which computation framework and autonomous algorithm will outperform the other.
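The L1-norm tracking problem can be cast as a linear program via the standard absolute-value reformulation (the symbols below are ours, not the paper's):

```latex
\min_{x,\,e} \; \sum_{t=1}^{T} e_t
\quad \text{s.t.} \quad
-e_t \le x_t - s_t \le e_t, \quad t = 1,\dots,T,
\qquad x \in \mathcal{F},
```

where $x_t$ is the household's consumption in time slot $t$, $s_t$ the broadcast power signal, $e_t$ an auxiliary variable bounding $|x_t - s_t|$, and $\mathcal{F}$ the household's feasibility region. Because the optimum can lie in the interior of $\mathcal{F}$, this formulation can reach allocations that no price signal can induce.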
Application of various tools to design, simulate and evaluate optical demultiplexers based on AWG
(2015)
Photonic integrated circuits are required for the next generations of coherent terabit optical communications. Software tools for the automated adjustment and coupling of optical fiber arrays to photonic integrated circuits have been developed. The obtained results are needed in the final production phase of the photonic integrated circuit packaging process.
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes. This is often the case because they aim to start with complicated and time-intensive Machine Learning scenarios. This work evaluates the Process Capability Analysis (PCA) as a pragmatic, easy and quick way of leveraging the gathered machine data from the production process. The area of application considered is injection molding. After describing all the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA results in multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights to be drawn about process stability and the associated product quality.
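The key performance indicators behind a Process Capability Analysis are the standard capability indices Cp and Cpk. A minimal sketch on an invented sample of a molded part dimension (specification limits are made up for the example):

```python
# Sketch: process capability indices Cp and Cpk, the kind of KPI a
# Process Capability Analysis yields, on hypothetical measurements (mm).
import statistics

def capability_indices(samples, lsl, usl):
    """Return (Cp, Cpk) for given lower/upper specification limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample std deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # penalizes off-centre
    return cp, cpk

samples = [10.01, 9.98, 10.02, 10.00, 9.99, 10.01, 10.00, 9.98]
cp, cpk = capability_indices(samples, lsl=9.90, usl=10.10)
```

Cpk is always at most Cp; a gap between the two indicates the process mean has drifted away from the centre of the tolerance band, which is exactly the kind of signal that tells the quality department where checks are needed.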
With digitalisation, and the increased connectivity between manufacturing systems emerging in this context, manufacturing is shifting towards decentralised, distributed concepts. Still, in manufacturing scenarios manual input or augmentation of data is required at system boundaries. Especially in distributed manufacturing environments, like Cloud Manufacturing (CMfg) systems, constant changes to the available manufacturing resources and products pose challenges for establishing connections between them. We propose a feature-oriented representation of concepts, especially from the manufacturing domain, which serves as the basis for (semi-)automatically linking, e.g., manufacturing resources and products. This linking methodology, as well as knowledge inferred using it, is then used to support distributed manufacturing, especially in CMfg environments, and to enhance product development. The concepts and methodologies are to be evaluated in a real-world learning factory.
The electricity demand due to the increasing number of EVs presents new challenges for the operation of the electricity network, especially for the distribution grids. The existing grid infrastructure may not be sufficient to meet the new demands imposed by the integration of EVs. Thus, EV charging may lead to reliability and stability issues, especially during peak demand periods. Demand side management (DSM) is a promising approach for mitigating the resulting impacts. In this work, we developed an autonomous DSM strategy for optimal charging of EVs to minimize the charging cost, and we conducted a simulation study to evaluate the impacts on grid operation. The proposed approach only requires a one-way communicated incentive. Real profiles from an Austrian study on mobility behavior are used to simulate the usage of the EVs. Furthermore, real smart meter data are used to simulate the household base load profiles, and a real low-voltage grid topology is considered in the load flow simulation. Day-ahead electricity stock market prices are used as the incentive driving the optimization. The optimum charging strategy is determined and compared to uncontrolled EV charging, showing a potential cost saving of about 30.8%. Although autonomous DSM of EVs achieves the pursued load shift, distribution grid operation may be substantially affected by it. We show that in the case of real-time price-driven operation, voltage drops and elevated peak-to-average powers result from the coincident charging of vehicles during favourable time slots.
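With only an energy requirement and a power limit per vehicle, cost-minimal charging against a day-ahead price signal reduces to filling the cheapest hours first. A sketch with invented prices and parameters (not the simulation setup of the paper):

```python
# Sketch: cost-minimal EV charging against an hourly price signal.
# With one energy requirement and one power limit, the optimum is to
# charge in the cheapest available hours first.

def cheapest_charging(prices, energy_need_kwh, max_power_kw, available_hours):
    """Return a per-hour charging schedule (kW) minimising energy cost."""
    schedule = [0.0] * len(prices)
    remaining = energy_need_kwh
    # visit available hours from cheapest to most expensive
    for h in sorted(available_hours, key=lambda h: prices[h]):
        if remaining <= 0:
            break
        p = min(max_power_kw, remaining)   # 1-hour slots: kW == kWh
        schedule[h] = p
        remaining -= p
    return schedule

prices = [0.30, 0.25, 0.10, 0.08, 0.09, 0.20]   # EUR/kWh, hypothetical
schedule = cheapest_charging(prices, energy_need_kwh=7.0,
                             max_power_kw=3.0, available_hours=range(6))
```

This greedy schedule is exactly the coincident behaviour the abstract warns about: when every household follows the same price signal, the cheapest hours attract all the charging load simultaneously.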
A new software tool, called AWG-Channel-Spacing, has been developed to calculate the accurate channel spacing of an arrayed waveguide grating (AWG) optical multiplexer/demultiplexer. The tool was developed with the Qt application framework in the programming language C++. It was evaluated with a design of a 20-channel, 200 GHz AWG. The achieved simulated transmission characteristics prove the correct functionality of the tool.
Using a simple femtosecond laser process, we fabricated metal-oxide/gold composite films for electrical and optical gas sensors. We designed a dripple-wavelength AWG spectrometer matched to the plasma absorption wavelength region of the composite films. The H2/CO absorptions fit well with the AWG design for multi-gas detection sensor arrays.
A new software tool, called AWG-Wuckler, has been developed to calculate the geometric parameters of arrayed waveguide grating structures for telecommunication and medical applications. These parameters are crucial for the AWG layout, which is created and simulated using commercial photonic design tools. The design process of an AWG is very complex because its geometric dimensions depend on a large number of input design parameters. Often, geometric constraints require an adjustment of the input design parameters and vice versa. Calculating and adjusting the geometric parameters is a time-consuming process that is currently not fully supported by any commercial photonic tool. The AWG-Wuckler tool overcomes this issue and offers a fast and easy-to-use solution. The tool has already been applied in various AWG designs and is technologically well proven.