975 results for Real Building Fires


Relevance: 20.00%

Abstract:

Increased levels of plasma oxLDL, the oxidized fraction of low-density lipoprotein (LDL), are associated with atherosclerosis, an inflammatory disease, and with the subsequent development of severe cardiovascular diseases that are today a major cause of death in developed countries. It is therefore important to find a reliable and fast assay to determine oxLDL in serum. A new immunosensor employing three monoclonal antibodies (mAbs) against oxLDL is proposed in this work as a quick and effective way to monitor oxLDL. The oxLDL was first employed to produce anti-oxLDL monoclonal antibodies from previously obtained hybridoma cells. The immunosensor was set up by self-assembling cysteamine (Cyst) on the gold (Au) layer (4 mm diameter) of a disposable screen-printed electrode. The three mAbs were allowed to react with N-hydroxysuccinimide (NHS) and ethyl(dimethylaminopropyl)carbodiimide (EDAC), and were subsequently incubated on the Au/Cyst surface. Bovine serum albumin (BSA) was then immobilized to ensure that molecules other than oxLDL could not bind to the electrode surface. All steps were followed by characterization techniques such as electrochemical impedance spectroscopy (EIS) and square wave voltammetry (SWV). The analytical response of the immunosensor was obtained by incubating the sensing layer of the device in oxLDL for 15 minutes prior to EIS and SWV, using standard oxLDL solutions prepared in foetal calf serum to simulate a patient's plasma with circulating oxLDL. A sensitive response was observed from 0.5 to 18.0 mg mL⁻¹. The device was successfully applied to determine the oxLDL fraction in real serum, without prior dilution or chemical treatment. The use of multiple monoclonal antibodies on a biosensing platform proved a successful approach for producing a specific response towards a complex multi-analyte target, correlating well with oxLDL levels in atherosclerosis, in a simple, fast and inexpensive way.
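As a rough illustration of how such standard readings might be turned into a working curve, the sketch below fits a linear calibration to hypothetical EIS responses at the standard oxLDL concentrations; the numerical values, the linear model and the helper function are assumptions for illustration, not data or methods from this work.

```python
# Minimal calibration sketch (hypothetical values, not data from this study).
import numpy as np

# Standard oxLDL concentrations (mg/mL) and hypothetical EIS responses
# (e.g. charge-transfer resistance, kOhm) after the 15 min incubation step.
conc = np.array([0.5, 2.0, 4.0, 8.0, 12.0, 18.0])
response = np.array([1.8, 3.1, 4.9, 8.6, 12.2, 17.5])

# Fit a first-order (linear) calibration curve: response = slope * conc + intercept.
slope, intercept = np.polyfit(conc, response, 1)

def concentration_from_response(r):
    """Invert the calibration curve to estimate oxLDL in an unknown serum sample."""
    return (r - intercept) / slope

print(f"slope={slope:.3f}, intercept={intercept:.3f}")
print("estimated oxLDL (mg/mL):", round(concentration_from_response(6.0), 2))
```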

Relevance: 20.00%

Abstract:

Microcystin-LR (MC-LR) is a dangerous toxin found in environmental waters, typically quantified by high-performance liquid chromatography and/or enzyme-linked immunosorbent assays. Quick, low-cost, on-site analysis is therefore required to ensure human safety and to enable wide screening programs. This work proposes label-free potentiometric sensors made of solid-contact electrodes coated with a surface-imprinted polymer grown on multi-walled carbon nanotubes (CNTs) incorporated in a polyvinyl chloride membrane. The imprinting effect was checked against non-imprinted materials. The MC-LR-sensitive sensors were evaluated, characterized and successfully applied to spiked environmental waters. The method offers the advantages of low cost, portability, easy operation and suitability for adaptation to flow methods.

Relevance: 20.00%

Abstract:

6th Real-Time Scheduling Open Problems Seminar (RTSOPS 2015), Lund, Sweden.

Relevance: 20.00%

Abstract:

This paper analyses forest fires from the perspective of dynamical systems. Forest fires exhibit complex correlations in size, space and time, revealing features often present in complex systems, such as the absence of a characteristic length-scale and the emergence of long-range correlations and persistent memory. This study addresses a public-domain forest fire catalogue containing information on events in Portugal during the period 1980 to 2012. The data are analysed on an annual basis, modelling the occurrences as sequences of Dirac impulses with amplitude proportional to the burnt area. First, we use mutual information to correlate annual patterns, together with visualization trees generated by hierarchical clustering algorithms, in order to compare and extract relationships among the data. Second, we adopt the Multidimensional Scaling (MDS) visualization tool. MDS generates maps where each object corresponds to a point, and objects perceived as similar to each other form clusters. The results are analysed in order to extract relationships among the data and to identify forest fire patterns.
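A minimal sketch of the kind of pipeline described above, assuming each year is summarized by an impulse-like burnt-area signal whose pairwise distances feed both hierarchical clustering and MDS; the synthetic data, the Euclidean metric and the library choices (SciPy, scikit-learn) are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch: compare annual fire signals via hierarchical clustering and MDS.
# Synthetic data and the distance metric are illustrative assumptions only.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
years = [str(y) for y in range(1980, 2013)]
# One "Dirac impulse train" per year, binned by day: amplitude ~ burnt area.
signals = rng.exponential(scale=10.0, size=(len(years), 365))

# Pairwise distances between annual signals.
condensed = pdist(signals, metric="euclidean")
dist = squareform(condensed)

# Hierarchical clustering -> visualization tree (dendrogram linkage matrix).
tree = linkage(condensed, method="average")

# MDS: embed the years as 2-D points so similar years land close together.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
for y, (x1, x2) in zip(years, coords):
    print(y, round(x1, 2), round(x2, 2))
```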

Relevance: 20.00%

Abstract:

International Real-Time Ada Workshop (IRTAW 2015), 20-22 April 2015, Pownal, U.S.A.

Relevance: 20.00%

Abstract:

The 30th ACM/SIGAPP Symposium on Applied Computing (SAC 2015), Embedded Systems track, 13-17 April 2015, Salamanca, Spain.

Relevance: 20.00%

Abstract:

Power laws, also known as Pareto-like laws or Zipf-like laws, are commonly used to describe a variety of distinct real-world phenomena, often characterized merely by the signals they produce. In this paper, we study twelve cases: worldwide technological accidents, the annual revenue of America's largest private companies, the number of inhabitants in America's largest cities, the magnitude of earthquakes with minimum moment magnitude equal to 4, the total burned area of forest fires in Portugal, the net worth of the richest people in America, the frequency of occurrence of words in the novel Ulysses, by James Joyce, the total number of deaths in worldwide terrorist attacks, the number of linking root domains of the top internet domains, the number of linking root domains of the top internet pages, the total number of human victims of tornadoes in the U.S., and the number of inhabitants in the 60 most populated countries. The results demonstrate the emergence of statistical characteristics very close to power-law behavior. Furthermore, the parametric characterization reveals complex relationships present at a higher level of description.
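As a hedged illustration of how a power-law exponent can be estimated for one of these signals, the sketch below applies the standard maximum-likelihood (Hill-type) estimator for a continuous power law above a threshold xmin; the synthetic sample and the fixed xmin are assumptions, and the paper's own fitting procedure may differ.

```python
# Sketch: maximum-likelihood estimate of a power-law exponent (continuous case).
# The synthetic sample and the fixed xmin are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
xmin = 1.0
true_alpha = 2.5
# Draw a power-law sample via inverse-transform sampling:
# x = xmin * (1 - u)^(-1 / (alpha - 1)).
u = rng.uniform(size=5000)
x = xmin * (1.0 - u) ** (-1.0 / (true_alpha - 1.0))

# MLE / Hill estimator: alpha_hat = 1 + n / sum(ln(x_i / xmin)) for x_i >= xmin.
tail = x[x >= xmin]
alpha_hat = 1.0 + tail.size / np.log(tail / xmin).sum()
stderr = (alpha_hat - 1.0) / np.sqrt(tail.size)

print(f"estimated exponent: {alpha_hat:.3f} +/- {stderr:.3f}")
```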

Relevance: 20.00%

Abstract:

Gottfried Leibniz generalized differentiation and integration, extending the operators from integer up to real, or even complex, orders. It is presently recognized that the resulting models capture long-term memory effects that are difficult to describe with classical tools. Leon Chua generalized the set of lumped electrical elements that provide the building blocks of mathematical models. His proposal of the memristor and of higher-order elements broadened the scope of variables and relationships embedded in the development of models. This paper follows these two directions and proposes a new logical step by generalizing the concept of junction. Classical junctions interconnect system elements using simple algebraic restrictions. Nevertheless, this simplistic approach may be misleading in the presence of unexpected dynamical phenomena and may require the inclusion of additional "parasitic" elements. The novel γ-junction includes, as special cases, the standard series and parallel connections, and allows a new degree of freedom when building models. The proposal motivates the search for experimental and real-world manifestations of the abstract conjectures.
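To make Leibniz's generalized operator concrete, the following sketch evaluates a fractional derivative numerically with the Grünwald-Letnikov discretization, one standard scheme rather than anything proposed in the paper; the step size and the test function are arbitrary choices.

```python
# Sketch: Grünwald-Letnikov approximation of a fractional derivative of order alpha.
# One standard discretization of the generalized operator, not the paper's own scheme.
import numpy as np

def gl_fractional_derivative(f, t, alpha, h=1e-3):
    """Approximate D^alpha f at time t using N = t/h past samples."""
    n = int(t / h)
    k = np.arange(n + 1)
    # Coefficients (-1)^k * C(alpha, k), built by the usual recurrence.
    coeffs = np.ones(n + 1)
    coeffs[1:] = np.cumprod(1.0 - (alpha + 1.0) / k[1:])
    samples = f(t - k * h)
    return (coeffs * samples).sum() / h**alpha

# Check against a known result: D^0.5 of f(t) = t equals 2*sqrt(t/pi).
t = 1.0
approx = gl_fractional_derivative(lambda x: x, t, alpha=0.5)
exact = 2.0 * np.sqrt(t / np.pi)
print(round(approx, 4), round(exact, 4))
```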

Relevance: 20.00%

Abstract:

This paper studies forest fires from the perspective of dynamical systems. Burnt area, precipitation and atmospheric temperatures are interpreted as state variables of a complex system, and the correlations between them are investigated by means of different mathematical tools. First, we use mutual information to reveal potential relationships in the data. Second, we adopt the state space portrait to characterize the system's behavior. Third, we compare the annual state space curves and apply clustering and visualization tools to unveil long-range patterns. We use forest fire data for Portugal, covering the years 1980-2003. The territory is divided into two regions (North and South), characterized by different climates and vegetation. The adopted methodology represents a new viewpoint in the context of forest fires, shedding light on a complex phenomenon that needs to be better understood in order to mitigate its devastating consequences, at both the economic and environmental levels.
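A small sketch of the first step, assuming a histogram-based estimate of mutual information between two of the state variables; the synthetic monthly series and the bin count are assumptions, not the study's data or estimator.

```python
# Sketch: histogram-based mutual information between two state variables
# (e.g. monthly burnt area and temperature). Synthetic data and the number
# of bins are illustrative assumptions.
import numpy as np

def mutual_information(x, y, bins=16):
    """I(X;Y) in nats, from a 2-D histogram estimate of the joint density."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(2)
temperature = rng.normal(20.0, 5.0, size=288)              # 24 years x 12 months
burnt_area = np.exp(0.15 * temperature) + rng.normal(0, 1, size=288)

print(f"I(burnt area; temperature) ~ {mutual_information(burnt_area, temperature):.3f} nats")
```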

Relevance: 20.00%

Abstract:

Distributed real-time systems, such as automotive applications, are becoming larger and more complex, thus requiring more powerful hardware and software architectures. Furthermore, such distributed applications commonly have stringent real-time constraints, which implies that they would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join Parallel/Distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time-Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. Our approach is guaranteed to find a feasible solution, if one exists, in contrast to other approaches based on heuristics. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
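The sketch below illustrates the flavour of a constraint programming formulation, reduced to a utilization-based task-to-core placement solved with OR-Tools CP-SAT; the solver choice, the task set and the capacity bound are assumptions, and the paper's actual formulation for fork-join tasks and network messages is considerably richer.

```python
# Sketch: allocating tasks to cores with a constraint solver (OR-Tools CP-SAT).
# A much-simplified, utilization-based placement, not the paper's full
# fork-join formulation; task set and capacities are assumptions.
from ortools.sat.python import cp_model

tasks = {"t1": 300, "t2": 450, "t3": 200, "t4": 600}   # utilization * 1000 (ints for CP-SAT)
cores = ["c1", "c2"]
capacity = 1000                                        # 100% utilization per core

model = cp_model.CpModel()
x = {(t, c): model.NewBoolVar(f"{t}_on_{c}") for t in tasks for c in cores}

# Each task is mapped to exactly one core.
for t in tasks:
    model.Add(sum(x[t, c] for c in cores) == 1)

# The load assigned to each core must not exceed its capacity.
for c in cores:
    model.Add(sum(tasks[t] * x[t, c] for t in tasks) <= capacity)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for (t, c), var in x.items():
        if solver.Value(var):
            print(f"{t} -> {c}")
else:
    print("no feasible mapping")
```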

Relevance: 20.00%

Abstract:

23rd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing (PDP 2015), 4-6 March 2015, Turku, Finland.

Relevance: 20.00%

Abstract:

Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test a hardware product under development. Although this is an attractive solution (low cost, and an easy and fast way to carry out some course work), it has major disadvantages. Since everything is currently done with, and in, a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays with it. Likewise, when using spreadsheets to build graphics, instead of a random table students could use a real dataset based, for instance, on the room temperature and its variation throughout the day. In this work we present a framework with a simple interface that allows it to be used in different courses where computers are part of the teaching/learning process, in order to give students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes (e.g. temperature, light, wind speed), which are either connected to a central server that students access over Ethernet, or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor readings in their different courses and, consequently, in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between schools: sensors that are not available at a given school can still be used by fetching their values from other sites that share them. Another remark is that students in more advanced years, with (theoretically) more know-how, can use courses related to electronics development to build new sensor modules and expand the framework further. The final solution is very interesting: it is low cost, simple to develop, and allows flexible use of resources by employing the same materials in several courses, bringing real-world data into the students' computer work.
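A minimal sketch of how one such sensor could be exposed to student software, assuming a USB-serial sensor read with pyserial and served over a plain TCP socket; the port name, baud rate and line format are hypothetical.

```python
# Sketch: exposing one low-cost serial sensor's readings over TCP so that any
# course tool (spreadsheet macro, numerical package, program) can fetch them.
# The serial port name, baud rate and line format are assumptions.
import serial            # pyserial
import socketserver

PORT_NAME = "/dev/ttyUSB0"   # hypothetical sensor attached via USB-serial
BAUD_RATE = 9600

class SensorHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Read one line from the sensor (e.g. "23.5\n" for temperature in °C)
        # and forward it to the connected student application.
        with serial.Serial(PORT_NAME, BAUD_RATE, timeout=2) as sensor:
            reading = sensor.readline().decode("ascii", errors="ignore").strip()
        self.wfile.write((reading + "\n").encode("ascii"))

if __name__ == "__main__":
    # Students connect with any TCP client to port 5000 to get a fresh reading.
    with socketserver.TCPServer(("0.0.0.0", 5000), SensorHandler) as server:
        server.serve_forever()
```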

Relevance: 20.00%

Abstract:

In RoboCup 2007: Robot Soccer World Cup XI.

Relevance: 20.00%

Abstract:

Master's dissertation in Communication Sciences, specialization in Contemporary Culture and New Technologies.

Relevance: 20.00%

Abstract:

In this paper, we propose the Distributed using Optimal Priority Assignment (DOPA) heuristic, which finds a feasible partitioning and priority assignment for distributed applications based on the linear transactional model. DOPA partitions the tasks and messages in the distributed system and uses the Optimal Priority Assignment (OPA) algorithm, known as Audsley's algorithm, to find the priorities for that partition. The experimental results show how the use of the OPA algorithm increases, on average, the number of schedulable tasks and messages in a distributed system when compared to the Deadline Monotonic (DM) assignment usually favoured in other works. Afterwards, we extend these results to the assignment of Parallel/Distributed applications and present a second heuristic named Parallel-DOPA (P-DOPA). In that case, we show how the partitioning process can be simplified by using the Distributed Stretch Transformation (DST), a parallel transaction transformation algorithm introduced in [1].
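For reference, a minimal sketch of the OPA (Audsley) loop that DOPA relies on, written against an abstract schedulability test; the toy test used here is a placeholder, not the transactional analysis from the paper.

```python
# Sketch: Audsley's Optimal Priority Assignment (OPA) loop. The schedulability
# test passed in is a placeholder; the paper's analysis for transactions would
# be plugged in instead.
def audsley_opa(tasks, is_schedulable):
    """Return {task: priority} (1 = lowest) or None if no assignment exists.

    is_schedulable(task, higher_priority_tasks) must be an OPA-compatible test.
    """
    unassigned = set(tasks)
    assignment = {}
    for priority in range(1, len(tasks) + 1):          # assign lowest level first
        candidate = next(
            (t for t in unassigned
             if is_schedulable(t, unassigned - {t})),  # all others assumed higher
            None,
        )
        if candidate is None:
            return None                                # infeasible at this level
        assignment[candidate] = priority
        unassigned.remove(candidate)
    return assignment

# Toy example: tasks as (computation_time, deadline); a crude utilization-style test.
tasks = {"a": (2, 10), "b": (3, 9), "c": (1, 4)}
def toy_test(task, higher):
    c, d = tasks[task]
    return c + sum(tasks[h][0] for h in higher) <= d   # placeholder, not exact analysis

print(audsley_opa(list(tasks), toy_test))
```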