943 results for Developers of Java system
Abstract:
East Africa’s Lake Victoria provides resources and services to millions of people on the lake’s shores and abroad. In particular, the lake’s fisheries are an important source of protein, employment, and international economic connections for the whole region. Nonetheless, stock dynamics are poorly understood and currently unpredictable. Furthermore, fishery dynamics are intricately connected to other supporting services of the lake as well as to lakeshore societies and economies. Much research has been carried out piecemeal on different aspects of Lake Victoria’s system; e.g., societies, biodiversity, fisheries, and eutrophication. However, to disentangle drivers and dynamics of change in this complex system, we need to put these pieces together and analyze the system as a whole. We did so by first building a qualitative model of the lake’s social-ecological system. We then investigated the model system through a qualitative loop analysis, and finally examined effects of changes on the system state and structure. The model and its contextual analysis allowed us to investigate system-wide chain reactions resulting from disturbances. Importantly, we built a tool that can be used to analyze the cascading effects of management options and establish the requirements for their success. We found that high connectedness of the system at the exploitation level, through fisheries having multiple target stocks, can increase the stocks’ vulnerability to exploitation but reduce society’s vulnerability to variability in individual stocks. We describe how there are multiple pathways to any change in the system, which makes it difficult to identify the root cause of changes but also broadens the management toolkit. Also, we illustrate how nutrient enrichment is not a self-regulating process, and that explicit management is necessary to halt or reverse eutrophication. This model is simple and can be used to assess system-wide effects of management policies, and can serve as a stepping stone for future quantitative analyses of system dynamics at local scales.
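To make the loop-analysis step concrete, the sketch below shows how a qualitative press-perturbation table is derived from a signed community matrix. The three-variable system and all interaction signs are invented for illustration and are not the paper's model.

```python
import numpy as np

# Hypothetical 3-variable sketch (not the paper's model): a fish stock,
# its fishery, and lake nutrient load, linked by signed interactions.
# A[i, j] is the sign of variable j's direct effect on variable i.
labels = ["stock", "fishery", "nutrients"]
A = np.array([
    [-1, -1,  1],   # stock: self-limited, harvested, boosted by productivity
    [ 1, -1,  0],   # fishery: grows with the stock, self-limited
    [ 0,  0, -1],   # nutrients: self-damped only (externally driven)
], dtype=float)

# Classic loop-analysis press-perturbation table: the response of each
# variable to a sustained input into each variable is -inv(A).
response = -np.linalg.inv(A)

print("Press-perturbation responses (row variable responds to column input):")
for i, row_label in enumerate(labels):
    for j, col_label in enumerate(labels):
        print(f"  d{row_label}/d(input to {col_label}) = {response[i, j]:+.2f}")
```

The signs of the resulting table are the kind of system-wide chain reactions the abstract refers to: a press on one variable propagates through every feedback loop before settling into a net effect on each of the others.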
Abstract:
Psychological assessment is a central component of applied sport psychology. Despite the obvious and well-documented advantages of diagnostic online tools, Switzerland has so far lacked a system of such tools for sport psychologists. Having the most frequently used questionnaires available online in one single tool for all listed Swiss sport psychologists would make practitioners' work considerably easier and less time-consuming. Therefore, the main goal of this project is to develop a diagnostic online tool system that makes available different questionnaires often used in sport psychology. Furthermore, we intend to survey the status and use of this diagnostic online tool system and its questionnaires by Swiss sport psychologists. A specific challenge is to limit access to qualified sport psychologists and to ensure confidentiality for the client. In particular, approved sport psychologists receive an individual code for each of their athletes for the required questionnaire. With the help of this code, athletes can access the test via a secure website from anywhere in the world. As soon as they complete and submit the online questionnaire, analysed and interpreted data reach the sport psychologist via e-mail, which is time-saving and easy to apply for the sport psychologist. Furthermore, the data are available for interpretation with athletes, and documentation of individual development over time is possible. Later on, completed and anonymised questionnaires will be collected and analysed. Larger amounts of collected data give more insight into the psychometric properties, thus helping to improve and further develop the questionnaires. In this presentation, we demonstrate the tool and its feasibility using the German version of the Test of Performance Strategies (TOPS, Schmid et al., 2010). To conclude, this diagnostic online tool system offers new possibilities for sport psychologists working as practitioners.
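As a rough illustration of the access-control workflow described above, the following sketch issues a one-time, per-athlete code and redeems it once; all identifiers and the storage scheme are hypothetical, not the project's actual implementation.

```python
import secrets

# Hypothetical sketch of the per-athlete access-code workflow: an approved
# psychologist requests a code that binds one athlete to one questionnaire;
# the athlete redeems it on the secure website.
_codes = {}  # code -> (psychologist_id, athlete_id, questionnaire_id)

def issue_code(psychologist_id: str, athlete_id: str, questionnaire_id: str) -> str:
    code = secrets.token_urlsafe(8)  # unguessable, URL-safe token
    _codes[code] = (psychologist_id, athlete_id, questionnaire_id)
    return code

def redeem_code(code: str):
    # One-time use: remove on redemption so a code cannot be replayed.
    return _codes.pop(code, None)

code = issue_code("psy-001", "athlete-042", "TOPS-German")
print(code, "->", redeem_code(code))
```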
Abstract:
Background: Diabetes mellitus is spreading throughout the world and diabetic individuals have been shown to often assess their food intake inaccurately; therefore, it is a matter of urgency to develop automated diet assessment tools. The recent availability of mobile phones with enhanced capabilities, together with advances in computer vision, has permitted the development of image analysis apps for the automated assessment of meals. GoCARB is a mobile phone-based system designed to support individuals with type 1 diabetes during daily carbohydrate estimation. In a typical scenario, the user places a reference card next to the dish and acquires two images using a mobile phone. A series of computer vision modules detect the plate and automatically segment and recognize the different food items, while their 3D shape is reconstructed. Finally, the carbohydrate content is calculated by combining the volume of each food item with the nutritional information provided by the USDA Nutrient Database for Standard Reference. Objective: The main objective of this study is to assess the accuracy of the GoCARB prototype when used by individuals with type 1 diabetes and to compare it to their own performance in carbohydrate counting. In addition, the user experience and usability of the system are evaluated by questionnaires. Methods: The study was conducted at the Bern University Hospital, “Inselspital” (Bern, Switzerland) and involved 19 adult volunteers with type 1 diabetes, each participating once. On each study day, a total of six meals of broad diversity were taken from the hospital’s restaurant and presented to the participants. The food items were weighed on a standard balance and the true amount of carbohydrate was calculated from the USDA nutrient database. Participants were asked to count the carbohydrate content of each meal independently and then by using GoCARB. At the end of each session, a questionnaire was completed to assess the user’s experience with GoCARB. Results: The mean absolute error was 27.89 (SD 38.20) grams of carbohydrate for the participants’ estimations, whereas the corresponding value for the GoCARB system was 12.28 (SD 9.56) grams of carbohydrate, a significantly better performance (P=.001). In 75.4% (86/114) of the meals, the GoCARB automatic segmentation was successful, and 85.1% (291/342) of individual food items were successfully recognized. Most participants found GoCARB easy to use. Conclusions: This study indicates that the system is able to estimate, on average, the carbohydrate content of meals with higher accuracy than individuals with type 1 diabetes can. The participants thought the app was useful and easy to use. GoCARB seems to be a well-accepted supportive mHealth tool for the assessment of served-on-a-plate meals.
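The final computation the abstract describes (the volume of each recognized item combined with nutritional information) can be illustrated with a minimal sketch; the carbohydrate densities and food names below are invented placeholders, not values from the USDA database.

```python
# Hypothetical sketch of the last GoCARB step: combine the reconstructed
# volume of each recognized food item with a per-food carbohydrate density.
# The densities here are illustrative; the system itself draws nutritional
# information from the USDA Nutrient Database for Standard Reference.
CARB_G_PER_ML = {"rice": 0.28, "peas": 0.09, "chicken": 0.0}

def meal_carbs(items):
    """items: list of (food_name, volume_ml) pairs produced by the
    segmentation, recognition and 3D reconstruction modules."""
    return sum(CARB_G_PER_ML[name] * volume_ml for name, volume_ml in items)

print(f"{meal_carbs([('rice', 180.0), ('peas', 90.0)]):.1f} g carbohydrate")
```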
Abstract:
Economic historians have recently emphasized the importance of integrating economic and historical approaches in studying institutions. The literature on the Ottoman system of taxation, however, has continued to adopt a primarily historical approach, using ad hoc categories of classification and explaining the system through its continuities with the historical precedent. This paper integrates economic and historical approaches to examine the structure, efficiency, and regional diversity of the tax system. The structure of the system made it possible for the Ottomans to economize on the transaction cost of measuring the tax base. Regional variations resulted from both efficient adaptations and institutional rigidities.
Abstract:
The usage of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research has investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2mm accuracy for single-beam fluence map verifications and to 5%/2mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and percent of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
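For readers unfamiliar with the dose-comparison metrics mentioned here, the sketch below implements a brute-force 2D gamma analysis with dose-difference and distance-to-agreement criteria. It is a generic illustration of the gamma index, not the thesis' NAT index or its clinical software.

```python
import numpy as np

def gamma_pass_fraction(measured, calculated, pixel_mm, dd=0.02, dta_mm=2.0):
    """Brute-force 2D gamma analysis: global dose-difference criterion `dd`
    (fraction of max dose) and distance-to-agreement `dta_mm`. A simple
    sketch of the standard metric, not the thesis' NAT index."""
    norm = dd * calculated.max()
    ny, nx = measured.shape
    search = int(np.ceil(3 * dta_mm / pixel_mm))  # search window in pixels
    passed = 0
    for y in range(ny):
        for x in range(nx):
            best = np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < ny and 0 <= xx < nx:
                        dose_term = ((measured[y, x] - calculated[yy, xx]) / norm) ** 2
                        dist_term = (dy**2 + dx**2) * pixel_mm**2 / dta_mm**2
                        best = min(best, dose_term + dist_term)
            passed += np.sqrt(best) <= 1.0  # gamma <= 1 counts as passing
    return passed / (nx * ny)
```

A 2%/2mm comparison such as the ones quoted above corresponds to `dd=0.02, dta_mm=2.0`; the "percent of pixels failing the gamma index" is simply one minus this pass fraction.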
Abstract:
Objective. In 2003, the State of Texas instituted the Driver Responsibility Program (TDRP), a program consisting of a driving infraction point system coupled with a series of graded fines and annual surcharges for specific traffic violations such as driving while intoxicated (DWI). Approximately half of the revenues generated are earmarked to be disbursed to the state's trauma system to cover uncompensated trauma care costs. This study examined initial program implementation, the impact of trauma system funding, and initial impact on impaired driving knowledge, attitudes and behaviors. A model for targeted media campaigns to improve the program's deterrence effects was developed. Methods. Data from two independent driver survey samples (conducted in 1999 and 2005), department of public safety records, state health department data and a state auditor's report were used to evaluate the program's initial implementation, impact and outcome with respect to drivers' impaired driving knowledge, attitudes and behavior (based on constructs of social cognitive theory) and hospital uncompensated trauma care funding. Survey results were used to develop a regression model of high-risk drivers who should be targeted to improve program outcome with respect to deterring impaired driving. Results. Low driver compliance with fee payment (28%) and program implementation problems were associated with lower surcharge revenues in the first two years ($59.5 million versus $525 million predicted). Program revenue distribution to trauma hospitals was associated with a 16% increase in designated trauma centers. Survey data demonstrated that only 28% of drivers are aware of the TDRP and that there has been no initial impact on impaired driving behavior. Logistic regression modeling suggested that targeted media campaigns highlighting the likelihood of DWI detection by law enforcement and the increased surcharges associated with the TDRP are required to deter impaired driving. Conclusions. Although the TDRP raised nearly $60 million in surcharge revenue for the Texas trauma system over the first two years, this study did not find evidence of a change in impaired driving knowledge, attitudes or behaviors from 1999 to 2005. Further research is required to measure whether the program is associated with decreased alcohol-related traffic fatalities.
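As an illustration of the kind of logistic regression modeling the abstract mentions, the sketch below fits a model of self-reported impaired driving against awareness variables. The features, coefficients, and data are synthetic placeholders, not drawn from the study's surveys.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for survey variables; invented for illustration only.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(18, 80, n),   # driver age
    rng.integers(0, 2, n),     # aware of TDRP surcharges (0/1)
    rng.integers(0, 2, n),     # perceives high DWI detection risk (0/1)
])
# Synthetic outcome: self-reported impaired driving.
logit = 0.02 * (45 - X[:, 0]) - 0.8 * X[:, 1] - 1.0 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
# Odds ratios below 1 would mark awareness variables as protective,
# identifying the groups a targeted media campaign should reach.
print("odds ratios:", np.exp(model.coef_).round(2))
```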
Abstract:
The pollen, spore and organic-walled dinoflagellate cyst associations of two marine sediment cores from the Java Sea off the mouths of the Jelai River (South Kalimantan) and Solo River (East Java) reflect environment and vegetation changes during the last ca 3500 years in the region. A decline in primary forest taxa (e.g. Agathis, Allophylus, Dacrycarpus, Dacrydium, Dipterocarpaceae, Phyllocladus, and Podocarpus) suggests that the major change in vegetation is caused by the opening of the forest canopy, which can be related to human activity. The successive increase of pollen of pioneer canopy and herb taxa (e.g. Acalypha, Ficus, Macaranga/Mallotus, Trema, Pandanus) indicates the development of secondary vegetation. In Java these changes started much earlier (at ca 2950 cal yr BP) than in Kalimantan (at ca 910 cal yr BP) and seem to be more severe. Changes in the marine realm, reflected by the dinoflagellate cyst association, correspond to the changes in vegetation on land. They reflect a gradual change from relatively well-ventilated to more hypoxic bottom/pore water conditions in a more eutrophic environment. Near the coast of Java, the shift in the water trophic status took place between ca 820 and 500 cal yr BP, while near the coast of Kalimantan it occurred as late as the beginning of the 20th century. We observe increasing amounts of the cysts of Polykrikos schwarzii and P. kofoidii, Lingulodinium machaerophorum, Nematosphaeropsis labyrinthus and Selenopemphix nephroides at times of secondary vegetation development on land, suggesting that these species react strongly to human-induced changes in the marine environment, probably related to increased pollution and eutrophication.
Abstract:
Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them and greater reliance will be placed on the delivery of their services. A relevant subset of them is high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices; many of the communications can transfer data at high speed. The concept of distributed systems emerged to describe systems whose different parts execute on several nodes that interact with each other via a communication network. Java’s popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This was the motivation for the development of the RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, the RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification is being developed under the Java Community Process (JSR-302); its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither the RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed. The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware that is suitable for the development of distributed hard real-time systems in Java, based on the integration of the RMI (Remote Method Invocation) model with the HRTJ profile. It has been designed and implemented with the main requirements in mind, such as predictability and reliability of timing behavior and resource usage. The design starts with the definition of a computational model which identifies, among other things: the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. It also includes mechanisms to monitor the functional and timing behavior. It provides independence from the network protocol by defining a network interface and modules. The JRMP protocol was modified to include the two phases, non-functional parameters, and message size optimizations.
Although serialization is one of the fundamental operations for ensuring proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (real consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.
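The separation between resource allocation and execution can be illustrated with a small sketch. It is written in Python for consistency with the other examples here, although the middleware itself is Java/RMI-based, and every name in it is hypothetical rather than part of the actual API.

```python
import time

# Conceptual sketch of a two-phase remote invocation: phase 1 binds the
# non-functional parameters and reserves resources for a remote reference;
# phase 2 performs invocations against that reservation. All names are
# hypothetical illustrations of the idea, not the thesis' API.
class RemoteReference:
    def __init__(self, endpoint, deadline_ms, max_message_bytes):
        # Phase 1: fix timing parameters and preallocate the message buffer
        # before any invocation, so no allocation happens at call time.
        self.endpoint = endpoint
        self.deadline_ms = deadline_ms
        self.buffer = bytearray(max_message_bytes)

    def invoke(self, payload: bytes):
        # Phase 2: the call must fit the reserved buffer and deadline.
        if len(payload) > len(self.buffer):
            raise ValueError("payload exceeds reserved message size")
        start = time.monotonic()
        self.buffer[:len(payload)] = payload       # stand-in for marshalling
        reply = bytes(self.buffer[:len(payload)])  # stand-in for the round trip
        elapsed_ms = (time.monotonic() - start) * 1000
        if elapsed_ms > self.deadline_ms:
            raise TimeoutError("deadline overrun detected by monitoring")
        return reply

ref = RemoteReference("fms-node-1", deadline_ms=5.0, max_message_bytes=256)
print(ref.invoke(b"waypoint-update"))
```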
Abstract:
A small Positron Emission Tomography demonstrator based on LYSO slabs and Silicon Photomultiplier matrices is under construction at the University and INFN of Pisa. In this paper we present the characterization results of the read-out electronics and of the detection system. Two SiPM matrices, each composed of 8 × 8 SiPM pixels at 1.5 mm pitch, have been coupled one-to-one to a LYSO crystal array. Custom front-end ASICs were used to read the 64 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port. Specific tests were carried out on the system in order to assess its performance. Furthermore, we have measured some of the most important parameters of the system for PET applications.
Abstract:
Ciao is a logic-based, multi-paradigm programming system. One of its most distinguishing features is that it supports a large number of semantic and syntactic language features which can be selectively activated or deactivated for each program module. As a result, a module can be written in, for example, ISO-Prolog plus constraints and higher order, while another can be a pure logic module with a different control rule such as iterative deepening and/or tabling, and perhaps using constructive negation. A powerful and modular extension mechanism allows user-level design and implementation of such features and sub-languages. Another distinguishing feature of Ciao is its powerful assertion language, which allows expressing many kinds of program properties (ranging from, e.g., moded types to resource consumption), as well as tests and documentation. The compiler is capable of statically finding violations of these properties or verifying that programs comply with them, and issuing certificates of this compliance. The compiler also performs many types of optimizations, including automatic parallelization. It offers very competitive performance, while retaining the flexibility and interactive development of a dynamic language. We will present a hands-on overview of the system, through small examples which emphasize the novel aspects and the motivations which lie behind Ciao's design and implementation.
Abstract:
This article presents the design, kinematic model and communication architecture for the multi-agent robotic system called SMART. The philosophy behind this kind of system requires the communication architecture to account for the concurrency of the whole system. The proposed architecture combines different communication technologies (TCP/IP and Bluetooth) under one protocol designed for cooperation among agents and other elements of the system such as IP cameras, the image processing library, the path planner, the user interface, the control block and the data block. The high-level control is modeled by Work-Flow Petri nets and implemented in C++ and C#. Experimental results show the performance of the designed architecture.
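One way to read "different communication technologies under one protocol" is that messages are framed independently of the transport, so the same bytes can travel over TCP/IP or Bluetooth. The sketch below shows such length-prefixed framing; the field names and JSON encoding are assumptions for illustration, not the SMART protocol's actual format.

```python
import json
import struct

# Minimal transport-independent framing sketch: build the message once,
# then hand the same bytes to either a TCP socket or a Bluetooth channel.
def frame(sender: str, receiver: str, action: str, data: dict) -> bytes:
    body = json.dumps({"from": sender, "to": receiver,
                       "action": action, "data": data}).encode()
    return struct.pack("!I", len(body)) + body  # 4-byte length prefix

def unframe(packet: bytes) -> dict:
    (length,) = struct.unpack("!I", packet[:4])
    return json.loads(packet[4:4 + length])

pkt = frame("robot-1", "planner", "request_path", {"goal": [2.5, 1.0]})
print(unframe(pkt))
```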
Abstract:
Renewable energy hybrid systems and mini-grids for electrification of rural areas are known to be reliable and more cost-efficient than grid extension or only-diesel based systems. However, there is still uncertainty in some areas; for example, which is the most efficient way of coupling hybrid systems: AC, DC or AC-DC? Using Matlab/Simulink, a mini-grid that connects a school, a small hospital and an ecotourism hostel has been modelled. This same mini-grid has been coupled in the different possible ways and the system’s efficiency has been studied. In addition, while keeping the consumption constant, the generation sources and the consumption profile have been modified and the effect on the efficiency under each configuration has also been analysed. Finally, different weather profiles have been introduced and, again, the effect on the efficiency of each system has been observed.
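The efficiency question can be made concrete with a back-of-the-envelope sketch: under each coupling topology, energy reaches the load through a different chain of converters, so system efficiency follows from the product of stage efficiencies along each path. The numbers below are invented placeholders, not the paper's Matlab/Simulink results.

```python
# Invented stage efficiencies, for illustration only.
ETA = {"pv_dcdc": 0.97, "inverter": 0.94, "rectifier": 0.95, "battery": 0.92}

def chain(*stages):
    """Efficiency of energy passing through a chain of converter stages."""
    eff = 1.0
    for s in stages:
        eff *= ETA[s]
    return eff

# PV energy delivered to an AC load under two couplings:
ac_coupled = chain("pv_dcdc", "inverter")                 # PV -> DC/DC -> inverter -> AC bus
dc_via_battery = chain("pv_dcdc", "battery", "inverter")  # PV -> DC bus -> battery -> inverter
print(f"AC-coupled direct: {ac_coupled:.3f}, DC bus via battery: {dc_via_battery:.3f}")
```

Which topology wins then depends on how much of the energy actually takes each path, which is why the generation mix and the consumption profile are varied in the study.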
Abstract:
The objective of this paper is to design a path following control system for a car-like mobile robot using classical linear control techniques, so that it adapts on-line to varying conditions during the trajectory following task. The main advantage of the proposed control structure is that well-known linear control theory can be applied in calculating the PID controllers to fulfil the control requirements, while at the same time it is flexible enough to be applied in the non-linear changing conditions of the path following task. For this purpose the Frenet frame kinematic model of the robot is linearised at a varying working point that is calculated as a function of the actual velocity, the path curvature and the kinematic parameters of the robot, yielding a transfer function that varies during the trajectory. The proposed controller is formed by a combination of an adaptive PID and a feed-forward controller, which varies accordingly with the working conditions and compensates the non-linearity of the system. The good features and flexibility of the proposed control structure have been demonstrated through realistic simulations that include both the kinematics and dynamics of the car-like robot.
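The adaptive PID plus feed-forward structure can be sketched as follows: gains are recomputed from the current working point (velocity and path curvature) and a feed-forward term supplies the steering the curvature itself demands. The gain-scheduling law shown is a placeholder, not the paper's tuning.

```python
# Sketch of an adaptive PID + feed-forward path-following controller.
# The gain schedule is an invented placeholder for illustration.
class AdaptivePathController:
    def __init__(self, wheelbase_m):
        self.L = wheelbase_m
        self.integral = 0.0
        self.prev_error = 0.0

    def gains(self, v, kappa):
        # Placeholder schedule: soften gains as speed grows.
        kp = 2.0 / max(v, 0.1)
        return kp, 0.1 * kp, 0.3 * kp

    def steer(self, lateral_error, v, kappa, dt):
        kp, ki, kd = self.gains(v, kappa)
        self.integral += lateral_error * dt
        derivative = (lateral_error - self.prev_error) / dt
        self.prev_error = lateral_error
        feedback = kp * lateral_error + ki * self.integral + kd * derivative
        feedforward = self.L * kappa  # steering needed just to track the curvature
        return feedback + feedforward

ctrl = AdaptivePathController(wheelbase_m=2.7)
print(ctrl.steer(lateral_error=0.15, v=5.0, kappa=0.05, dt=0.02))
```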
Abstract:
This paper describes a novel deployment of an intelligent user-centered HVAC (Heating, Ventilation and Air Conditioning) control system. The main objective of this system is to optimize user comfort and to reduce energy consumption in office buildings. Existing commercial HVAC control systems work in a fixed and predetermined way. The novelty of the proposed system is that it adapts dynamically to the user and to the building environment. For this purpose, the system architecture has been designed under the paradigm of Ambient Intelligence. A prototype of the proposed system has been tested in a real-world environment.
Abstract:
Using the Monte Carlo method, the behavior of a system of true hard cylinders has been studied. Values of the length-to-breadth ratio L/D and packing fraction η have been chosen similar to those of real nematic liquid crystals. Results include the radial distribution function g(r), structure factor S(k), and orientational order parameter M. These results lead to the conclusion that the hard cylinder model may be a useful reference for real mesomorphic phases.
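As an illustration of how an orientational order parameter such as M is typically computed, the sketch below builds the nematic Q-tensor from cylinder axes and takes its largest eigenvalue; the random (isotropic) orientations are illustrative, not configurations from the paper's simulation.

```python
import numpy as np

# Order parameter from the nematic Q-tensor: Q = (3/2)<u u^T> - (1/2)I,
# with M the largest eigenvalue. Random axes give an isotropic sample,
# so M should be near 0; a nematic configuration would give M closer to 1.
rng = np.random.default_rng(1)
u = rng.normal(size=(1000, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)  # unit axis of each cylinder

Q = 1.5 * np.einsum("ni,nj->ij", u, u) / len(u) - 0.5 * np.eye(3)
M = np.linalg.eigvalsh(Q).max()
print(f"orientational order parameter M = {M:.3f}")
```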