977 results for Data Allocation
Abstract:
Distributed real-time systems such as automotive applications are becoming larger and more complex, thus requiring more powerful hardware and software architectures. Furthermore, these distributed applications commonly have stringent real-time constraints, which implies that such applications would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join parallel/distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time-Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. In contrast to approaches based on heuristics, our approach is guaranteed to find a feasible solution, if one exists. Furthermore, constraint programming approaches have been shown to obtain solutions for this type of formulation in reasonable time.
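To illustrate the flavor of such a formulation, the following minimal sketch (in Python, using Google OR-tools CP-SAT; this is not the authors' tool, and the subtask utilizations and per-core capacity bound are invented for the example) assigns subtasks to cores under allocation and capacity constraints, with the solver either returning a feasible allocation or proving that none exists:

    # Minimal sketch (not the authors' formulation): constraint-programming
    # allocation of subtasks to cores with Google OR-tools CP-SAT.
    from ortools.sat.python import cp_model

    util = [30, 25, 40, 20, 35]   # per-subtask utilization in percent (assumed)
    n_cores = 2

    model = cp_model.CpModel()
    # x[i][k] == 1 iff subtask i is allocated to core k
    x = [[model.NewBoolVar(f"x_{i}_{k}") for k in range(n_cores)]
         for i in range(len(util))]

    for row in x:                     # each subtask goes to exactly one core
        model.Add(sum(row) == 1)
    for k in range(n_cores):          # capacity proxy: at most 100% per core
        model.Add(sum(util[i] * x[i][k] for i in range(len(util))) <= 100)

    solver = cp_model.CpSolver()
    status = solver.Solve(model)
    if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
        for i, row in enumerate(x):
            core = next(k for k, v in enumerate(row) if solver.Value(v))
            print(f"subtask {i} -> core {core}")
    else:
        print("provably infeasible")  # unlike heuristics, CP proves infeasibility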
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test hardware or a product under development. Although this is an attractive solution, being a low-cost, easy and fast way to carry out coursework, it has major disadvantages. As everything is currently done with or in a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Likewise, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by the different courses in which computers are part of the teaching/learning process, in order to give students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that the students access over Ethernet or connected directly to the student's computer/laptop. These sensors use the communication ports commonly available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools; a sensor that is not available at a given school can be used by getting its values from another school that shares it. Moreover, students in the more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronics development to build new sensor modules and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible, allowing the same materials to be used in several courses while bringing real-world data into the students' computer work.
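As a sketch of how a student program might consume a reading from the central server (the host name, port, and one-line text protocol below are assumptions for illustration, not the framework's actual interface):

    # Minimal sketch: fetch one temperature sample from the central sensor
    # server over TCP. Host, port and protocol are hypothetical.
    import socket

    SERVER = ("sensors.example-school.edu", 5000)  # hypothetical address

    with socket.create_connection(SERVER, timeout=5) as sock:
        sock.sendall(b"GET temperature\n")      # assumed one-line text request
        reply = sock.recv(64).decode().strip()  # e.g. "21.4" (degrees Celsius)

    print(f"room temperature: {float(reply)} C")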
Abstract:
Demo presented at the Workshop on ns-3 (WNS3 2015), 13–14 May 2015, Castelldefels, Spain.
Abstract:
Data Mining (DM) methods are being increasingly used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction, an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction, which is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
Abstract:
Internship report presented in fulfilment of the requirements for the Master's degree in New Media and Web Practices.
Abstract:
Twenty-four whole blood and serum samples were drawn from an eight-year-old heart transplant child during a 36-month follow-up. EBV serology was positive for VCA-IgM and VCA-IgG, and negative for EBNA-IgG, at the age of five, when the child presented with signs and symptoms suggestive of acute infectious mononucleosis. After 14 months, the serological parameters were positive VCA-IgG and EBNA-IgG, and negative VCA-IgM. This serological pattern has been maintained since then, even during episodes suggestive of EBV reactivation. PCR amplified a specific DNA fragment from the EBV gp220 (detection limit of 100 viral copies). All twenty-four whole blood samples yielded positive results by PCR, while 12 out of 24 serum samples were positive. We aimed at analyzing whether detection of EBV-DNA in serum samples by PCR was associated with overt disease, as indicated by the need for antiviral treatment and hospitalization. Statistical analysis showed agreement between the two parameters, evidenced by the Kappa test (value 0.750; p < 0.001). We concluded that detection of EBV-DNA in serum samples of immunosuppressed patients might be used as a laboratory marker of active EBV disease when Real-Time PCR or another quantitative method is not available.
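For readers unfamiliar with the statistic, the sketch below shows how Cohen's kappa is computed from a 2x2 agreement table; the table is hypothetical (it is not the paper's raw data), though chosen so that it reproduces the reported value of 0.750:

    # Cohen's kappa from a hypothetical 2x2 table: rows = serum PCR +/-,
    # columns = overt disease +/- (NOT the paper's raw data).
    import numpy as np

    table = np.array([[10.0,  2.0],
                      [ 1.0, 11.0]])
    n = table.sum()
    p_o = np.trace(table) / n                   # observed agreement
    p_e = (table.sum(1) @ table.sum(0)) / n**2  # chance agreement from marginals
    kappa = (p_o - p_e) / (1 - p_e)
    print(f"kappa = {kappa:.3f}")               # 0.750 for this table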
Abstract:
Dissertation presented for the degree of Master in Physics Engineering.
Abstract:
Forty Echinococcus isolates from sheep and cattle in Southern Brazil were genetically analysed in order to obtain further data on the presence of different taxa of the Echinococcus granulosus complex. Differentiation was carried out using a PCR technique and sequencing of the mitochondrial cytochrome c oxidase subunit 1 (CO1) gene. Most samples (38) could be allocated to the sheep strain (G1) of E. granulosus, while two samples belonged to E. ortleppi, previously known as the cattle strain (G5) of E. granulosus. Because the latter taxon has a shorter prepatent period in dogs, these records have important implications for the design of control measures in this endemic region.
Abstract:
Project work presented as a partial requirement for the degree of Master in Statistics and Information Management.
Abstract:
Presented at the Work-in-Progress Session of the IEEE Real-Time Systems Symposium (RTSS 2015), 1–4 December 2015, San Antonio, USA.
Abstract:
Project submitted in partial fulfilment of the requirements for the degree of Master in English Teaching.
Abstract:
The high penetration of distributed energy resources (DER) in distribution networks and the competitive environment of electricity markets impose the use of new approaches in several domains. Network cost allocation, traditionally used in transmission networks, should be adapted to and used in distribution networks, considering the specifications of the connected resources. The main goal is to develop a fairer methodology that distributes the distribution network use costs among all players using the network in each period. In this paper, a model considering different types of costs (fixed, losses, and congestion costs) is proposed, comprising the use of a large set of DER, namely distributed generation (DG), demand response (DR) of the direct load control type, energy storage systems (ESS), and electric vehicles with the capability of discharging energy to the network, known as vehicle-to-grid (V2G). The proposed model includes three distinct phases of operation. The first phase consists of an economic dispatch based on an AC optimal power flow (AC-OPF); in the second phase, Kirschen's and Bialek's tracing algorithms are used and compared to evaluate the impact of each resource on the network. Finally, the MW-mile method is used in the third phase of the proposed model. A distribution network of 33 buses with a large penetration of DER is used to illustrate the application of the proposed model.
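As a small worked example of the idea behind the third phase, the sketch below implements one common MW-mile variant on invented data (not the paper's 33-bus case): each line's length-weighted cost is divided among resources in proportion to the MW flow each one induces on that line, as estimated by a tracing algorithm such as Bialek's:

    # Minimal MW-mile sketch on invented data (not the paper's 33-bus case).
    lines = {                 # line -> (length in miles, annual cost per mile)
        "L1": (10.0, 100.0),
        "L2": ( 4.0, 100.0),
    }
    flows = {                 # (resource, line) -> MW induced (from tracing)
        ("DG1", "L1"): 3.0, ("DG1", "L2"): 1.0,
        ("V2G", "L1"): 1.0, ("V2G", "L2"): 3.0,
    }

    charges = {r: 0.0 for r, _ in flows}
    for line, (length, cost_per_mile) in lines.items():
        cost = length * cost_per_mile     # total line cost (MW-mile weighting)
        total = sum(mw for (r, l), mw in flows.items() if l == line)
        for (r, l), mw in flows.items():
            if l == line:
                charges[r] += cost * mw / total   # proportional share

    for r, c in sorted(charges.items()):
        print(f"{r}: {c:.2f}/year")       # DG1: 850.00, V2G: 550.00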
Abstract:
The incidence of Candida bloodstream infection has increased over the past years. In the Center-West region of Brazil, data on candidemia are scarce. This paper reports a retrospective analysis of 96 cases of Candida bloodstream infection at a Brazilian tertiary-care teaching hospital in the state of Mato Grosso do Sul, from January 1998 to December 2006. Demographic, clinical and laboratory data were collected from medical records and from the hospital's laboratory database. Patients' ages ranged from three days to 92 years, with 53 (55.2%) adults and 43 (44.8%) children; of the latter, 25 (58.1%) were newborns. The risk conditions most often found were a long period of hospitalization, the use of a central venous catheter, and previous use of antibiotics. Fifty-eight (60.4%) patients died during the hospitalization period, and eight (13.7%) of them died 30 days after the diagnosis of candidemia. Candida albicans (45.8%) was the most prevalent species, followed by C. parapsilosis (34.4%), C. tropicalis (14.6%) and C. glabrata (5.2%). This is the first report of Candida bloodstream infection in the state of Mato Grosso do Sul, and it highlights the importance of considering the possibility of invasive Candida infection in patients exposed to risk factors, particularly among neonates and the elderly.
Abstract:
Dissertation presented for the degree of Master in Informatics Engineering.
Abstract:
New arguments are presented proving that successive (repeated) measurements have a memory and actually remember each other. The recognition of this peculiarity can essentially change the existing paradigm associated with conventional observation of the behavior of different complex systems, and leads towards the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of Prony's decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points, and allows comparing measured data in cases where a "best fit" model based on some specific physical principles is absent. As an example, we consider two X-ray diffractometers (defined in the paper as A ("cheap") and B ("expensive")) that are used, after proper calibration, for measuring the same substance (corundum, α-Al2O3). The amplitude-frequency response (AFR) obtained in the frame of Prony's decomposition can be used for comparison of the spectra recorded from the A and B X-ray diffractometers (XRDs) for calibration and other practical purposes. We also prove that the Fourier decomposition corresponds to an "ideal" experiment without memory, while Prony's decomposition corresponds to a real measurement and can in this case be fitted in the frame of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of their internal "filling") are found. The suggested approach is rather general and can be used for the calibration and comparison of different complex dynamical systems for practical purposes.
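For readers unfamiliar with Prony's decomposition, the sketch below runs the classical three-step procedure (linear prediction, root finding, least-squares amplitudes) on a toy signal of damped cosines; it illustrates the method only, and is not the paper's XRD data or IM fitting code:

    # Classical Prony decomposition: fit y_n ~ sum_k c_k * z_k**n.
    import numpy as np

    def prony(y, p):
        N = len(y)
        # 1) linear-prediction coefficients from a Hankel least-squares system
        A = np.column_stack([y[m:N - p + m] for m in range(p)])
        a = np.linalg.lstsq(A, y[p:], rcond=None)[0]
        # 2) poles z_k: roots of z**p - a[p-1]*z**(p-1) - ... - a[0]
        z = np.roots(np.concatenate(([1.0], -a[::-1])))
        # 3) complex amplitudes c_k by least squares on the Vandermonde matrix
        V = np.vander(z, N, increasing=True).T   # V[n, k] = z_k**n
        c = np.linalg.lstsq(V, y, rcond=None)[0]
        return c, z

    # toy signal: two damped cosines (four complex-exponential terms)
    n = np.arange(64)
    y = 1.5 * 0.95**n * np.cos(0.4 * n) + 0.7 * 0.90**n * np.cos(1.1 * n + 0.3)
    c, z = prony(y.astype(complex), p=4)
    print(np.round(np.sort(np.abs(z)), 3))  # pole magnitudes: 0.9, 0.9, 0.95, 0.95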