952 results for Field Analysis Comfa


Relevance:

30.00%

Publisher:

Abstract:

The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimation of the original frame created at the decoder. This paper characterizes WZVC efficiency when motion-compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC, especially because the decoder only has some decoded reference frames available. The proposed WZVC compression-efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. From this model, conclusions may be drawn about the impact of motion-field smoothness, and of its correlation with the true motion trajectories, on compression performance.
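The qualitative link the abstract describes — a less accurate MCFI motion field yields a noisier side-information estimate and hence a higher coding rate — can be illustrated with a textbook Gaussian rate-distortion bound. This is an illustrative sketch, not the paper's actual rate model; `residual_variance` and its `gain` constant are hypothetical stand-ins for the spectral model:

```python
import math

def gaussian_rate(residual_var, distortion):
    """Shannon rate-distortion bound for a Gaussian source:
    R(D) = 0.5 * log2(sigma^2 / D) bits/sample, clipped at zero."""
    return max(0.0, 0.5 * math.log2(residual_var / distortion))

def residual_variance(signal_var, motion_error_var, gain=1.0):
    # Toy model (hypothetical): the side-information residual grows with
    # motion-field inaccuracy, saturating at the signal variance itself.
    return min(signal_var, gain * motion_error_var)
```

With a fixed target distortion, a poor motion field (large `motion_error_var`) produces a larger residual variance and therefore a higher rate than an accurate one, mirroring the trend the rate model formalizes.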

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The aim of this paper is to promote qualitative methodology within the scientific community of management. The specific objective is to propose an empirical research process based on the case study method, ensuring rigor in the empirical research process so that future research may follow a similar procedure to the one proposed. Design/methodology/approach: Following a qualitative methodological approach, we propose a research process that develops in four phases, each with several stages. This study analyses the preparatory and fieldwork phases and their stages. Findings: The paper shows the influence that case studies have on the qualitative empirical research process in management. Originality/value: The case study method assumes an important role within qualitative research by allowing for the study and analysis of certain types of phenomena that occur inside organisations, and in respect of which quantitative studies cannot provide an answer.

Relevance:

30.00%

Publisher:

Abstract:

Quinoxaline and its derivatives are an important class of heterocyclic compounds, in which the elements N, S and O replace carbon atoms in the ring. The molecular formula of quinoxaline is C8H6N2, formed by two aromatic rings, benzene and pyrazine. It is rare in the natural state, but its synthesis is easy to carry out. Modifications to the quinoxaline structure provide a great variety of compounds and activities, such as antimicrobial, antiparasitic, antidiabetic, antiproliferative, anti-inflammatory, anticancer, antiglaucoma and antidepressant activity, the latter through AMPA receptor antagonism. These compounds are also important in industry owing, for example, to their power to inhibit metal corrosion. Computational chemistry, a natural branch of theoretical chemistry, is a well-developed method used to represent molecular structures, simulating their behaviour with the equations of quantum and classical physics. A wide variety of computational chemistry software tools is available, allowing the calculation of energies, geometries, vibrational frequencies, transition states, reaction pathways, excited states and a variety of properties based on several uncorrelated and correlated wave functions. Their application to the study of quinoxalines is therefore important for determining their chemical characteristics, allowing a more complete analysis in less time and at lower cost.

Relevance:

30.00%

Publisher:

Abstract:

Master's thesis in Mechanical Engineering, in the field of Maintenance and Production.

Relevance:

30.00%

Publisher:

Abstract:

The use of multicores is becoming widespread in the field of embedded systems, many of which have real-time requirements. Hence, ensuring that real-time applications meet their timing constraints is a prerequisite before deploying them on these systems. This necessitates considering the impact of contention for shared low-level hardware resources, such as the front-side bus (FSB), on the Worst-Case Execution Time (WCET) of the tasks. Towards this aim, this paper proposes a method to determine an upper bound on the number of bus requests that tasks executing on a core can generate in a given time interval. We show that our method yields tighter upper bounds than the state of the art. We then apply our method to compute the extra contention delay incurred by tasks when they are co-scheduled on different cores and access the shared main memory over a shared bus, access to which is granted using a round-robin (RR) arbitration protocol.
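The structure of such a bound can be sketched in a few lines. This is a generic illustration of request counting and round-robin interference, not the paper's (tighter) analysis; the minimum inter-request gap and slot length are hypothetical parameters:

```python
def max_bus_requests(interval, min_gap):
    """Upper bound on bus requests one core can issue in 'interval' cycles,
    assuming (hypothetically) at least 'min_gap' cycles elapse between
    consecutive requests. The +1 accounts for a request at instant zero."""
    return interval // min_gap + 1

def rr_contention_delay(n_requests, n_cores, slot_len):
    """Under round-robin arbitration, each request waits for at most one
    bus slot per competing core, so the extra contention delay is bounded
    by n_requests * (n_cores - 1) * slot_len cycles."""
    return n_requests * (n_cores - 1) * slot_len
```

For example, a task issuing at most 100 requests on a 4-core system with 40-cycle bus slots suffers at most 100 * 3 * 40 = 12000 cycles of extra delay; tightening the request bound directly tightens the WCET bound.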

Relevance:

30.00%

Publisher:

Abstract:

The usage of COTS-based multicores is becoming widespread in the field of embedded systems. Providing real-time guarantees at design time is a prerequisite to deploying real-time systems on these multicores. This necessitates considering the impact of contention for shared low-level hardware resources on the Worst-Case Execution Time (WCET) of the tasks. As a step towards this aim, this paper first identifies the different factors that make WCET analysis a challenging problem in a typical COTS-based multicore system. Then, we propose, and prove mathematically correct, a method to determine tight upper bounds on the WCET of tasks when they are co-scheduled on different cores.

Relevance:

30.00%

Publisher:

Abstract:

Hyperspectral instruments have been incorporated into satellite missions, providing large amounts of high-spectral-resolution data of the Earth's surface. These data can be used in remote sensing applications that often require a real-time or near-real-time response. To avoid delays between hyperspectral image acquisition and its interpretation, the latter usually done at a ground station, onboard systems have emerged to process the data, reducing the volume of information to transfer from the satellite to the ground station. For this purpose, compact reconfigurable hardware modules, such as field-programmable gate arrays (FPGAs), are widely used. This paper proposes an FPGA-based architecture for hyperspectral unmixing. The method is based on vertex component analysis (VCA) and works without a dimensionality-reduction preprocessing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 system-on-chip, whose FPGA fabric is based on the Artix-7 programmable logic, and tested using real hyperspectral data. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform to implement high-performance, low-cost embedded systems, opening perspectives for onboard hyperspectral image processing.
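The geometric idea behind VCA — endmembers are the vertices of the simplex enclosing the pixel cloud, found by repeatedly projecting orthogonally to the endmembers already selected — can be sketched as below. This is a greedy illustration in the spirit of VCA, not the paper's FPGA architecture nor the full algorithm (which uses randomized projections and a signal-subspace step):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def scale(a, s):
    return [x * s for x in a]

def extract_endmembers(pixels, p):
    """Greedy endmember search: repeatedly pick the pixel whose component
    orthogonal to the span of the already-selected endmembers is largest,
    i.e. the most 'extreme' remaining vertex of the data simplex."""
    basis = []      # orthonormal basis of the span of selected endmembers
    selected = []
    for _ in range(p):
        best_px, best_r, best_n = None, None, -1.0
        for px in pixels:
            r = list(px)
            for b in basis:             # project out the current span
                r = sub(r, scale(b, dot(r, b)))
            n = math.sqrt(dot(r, r))
            if n > best_n:
                best_px, best_r, best_n = px, r, n
        selected.append(best_px)
        basis.append(scale(best_r, 1.0 / best_n))
    return selected
```

Given pure pixels plus mixtures of them, the procedure recovers the pure spectra, since any mixture lies strictly inside the simplex spanned by its endmembers.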

Relevance:

30.00%

Publisher:

Abstract:

This work models the competitive behaviour of individuals who maximize their own utility by managing their network of connections with other individuals. Utility is taken as a synonym of reputation in this model. Each agent has to decide between two variables: the quality of connections and the number of connections. Hence, the reputation of an individual is a function of the number and the quality of connections within the network. On the other hand, individuals incur a cost when they improve their network of contacts. The initial quality and number of connections of each individual follow a given initial distribution. The competition occurs over continuous time and among a continuum of agents. A mean field game approach is adopted to solve the model, leading to an optimal trajectory for the number and quality of connections for each individual.
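The flavour of such a trajectory can be illustrated with a toy single-agent version. This is a simplified sketch under assumed dynamics (linear decay of stale links, quadratic effort cost), not the mean field game solved in the paper:

```python
def best_response_effort(marginal_value, cost_coeff):
    """With quadratic effort cost cost_coeff * a**2 / 2 and a constant
    marginal reputation value, the utility-maximizing effort is
    a* = marginal_value / cost_coeff (first-order condition)."""
    return marginal_value / cost_coeff

def simulate_connections(q0, n0, effort_q, effort_n, delta, dt, steps):
    """Euler integration of toy dynamics: quality q and number n of
    connections grow with effort and decay (links go stale) at rate delta."""
    q, n = q0, n0
    for _ in range(steps):
        q += (effort_q - delta * q) * dt
        n += (effort_n - delta * n) * dt
    return q, n
```

Under constant effort, both state variables converge to the steady state effort/delta, the kind of long-run optimal trajectory the mean field game approach characterizes for the whole population at once.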

Relevance:

30.00%

Publisher:

Abstract:

Information systems are widespread and used by anyone with computing devices, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs: namely, programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks; that is, it verifies whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
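The core idea — a field's security level depending on a value stored in a sibling field — can be illustrated with a runtime check. This Python sketch uses hypothetical field names, and note that the thesis's contribution is a static type system that verifies such policies at development time, not dynamic checks:

```python
LEVELS = {"public": 0, "confidential": 1, "secret": 2}

def field_label(record, field):
    """The security label of the 'content' field is not fixed: it depends
    on the runtime value stored in the record's 'classification' field."""
    if field == "content":
        return record["classification"]
    return "public"

def can_read(clearance, record, field):
    # Information may flow only to readers at an equal or higher level.
    return LEVELS[clearance] >= LEVELS[field_label(record, field)]
```

Two records of the same structural type can thus carry differently classified contents, which is exactly what a single static label per type cannot express and dependent information flow types can.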

Relevance:

30.00%

Publisher:

Abstract:

Field lab: Business project

Relevance:

30.00%

Publisher:

Abstract:

A computerized handheld procedure is presented in this paper. It is intended as a complementary database tool to enhance prospective risk analysis in the field of occupational health. The Pendragon Forms software (version 3.2) was used to implement acquisition procedures on Personal Digital Assistants (PDAs) and to transfer data to a computer in MS-Access format. The proposed data acquisition strategy relies on the risk assessment method practiced at the Institute of Occupational Health Sciences (IST). It involves the use of a systematic hazard list and semi-quantitative risk assessment scales. A set of 7 modular forms has been developed to cover the basic needs of field audits. Despite the minor drawbacks observed, the results obtained so far show that handhelds are adequate to support field risk assessment and follow-up activities. Further improvements must still be made to increase the tool's effectiveness and field adequacy.
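A semi-quantitative risk assessment of the kind the forms capture typically combines ordinal severity and probability scales into a prioritized risk class. The scales and cut-offs below are hypothetical illustrations; the abstract does not specify the actual IST scales:

```python
def risk_score(severity, probability):
    """Semi-quantitative risk: product of two ordinal scales
    (here assumed to run 1-4 each; the real scales may differ)."""
    assert 1 <= severity <= 4 and 1 <= probability <= 4
    return severity * probability

def risk_class(score):
    # Hypothetical cut-offs for prioritising follow-up actions.
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"
```

Encoding the scales in the PDA form keeps field ratings consistent between auditors and makes the exported MS-Access records directly sortable by priority.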

Relevance:

30.00%

Publisher:

Abstract:

In 1903, the eastern slope of Turtle Mountain (Alberta) was affected by a 30 million m³ rockslide, named the Frank Slide, that resulted in more than 70 casualties. Assuming that the main discontinuity sets, including bedding, control part of the slope morphology, the structural features of Turtle Mountain were investigated using a digital elevation model (DEM). Using new landscape analysis techniques, we have identified three main joint and fault sets. These results are in agreement with the sets identified through field observations. Landscape analysis techniques using a DEM confirm and refine the most recent geological model of the Frank Slide. The rockslide was initiated along bedding and a fault at the base of the slope, and propagated upslope by a regressive process following a surface composed of pre-existing discontinuities. The DEM analysis also permits the identification of important geological structures along the 1903 slide scar. Based on the so-called Sloping Local Base Level (SLBL), an estimate was made of the presently unstable volumes in the main scar delimited by the cracks, and around the south area of the scar (South Peak). The SLBL is a method permitting a geometric interpretation of the failure surface based on a DEM. Finally, we propose a failure mechanism permitting progressive failure of the rock mass that considers gently dipping wedges (30°). The prisms or wedges defined by two discontinuity sets permit the creation of a failure surface by progressive failure. Such structures are commonly observed in recent rockslides. This method is efficient and is recommended as a preliminary analysis prior to field investigation.
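The SLBL construction can be sketched in one dimension: iteratively lower each point of the topographic profile towards the mean of its neighbours (plus a tolerance controlling the curvature of the base surface), and take the gap between topography and the converged surface as the unstable volume. This is a 1-D illustration of the principle; the actual analysis operates on a 2-D DEM:

```python
def slbl(elev, tol, iterations=1000):
    """1-D Sloping Local Base Level: repeatedly replace each interior node
    by min(z, mean of its two neighbours + tol). With tol = 0 the surface
    converges towards the chord between the profile's fixed endpoints."""
    z = list(elev)
    for _ in range(iterations):
        changed = False
        for i in range(1, len(z) - 1):
            target = (z[i - 1] + z[i + 1]) / 2.0 + tol
            if target < z[i]:
                z[i] = target
                changed = True
        if not changed:
            break
    return z

def unstable_volume(elev, base, cell_width=1.0):
    """Volume (per unit slope width) between topography and SLBL surface."""
    return sum((a - b) * cell_width for a, b in zip(elev, base))
```

A positive tolerance yields a curved (concave-up) base surface rather than a straight chord, which is how the method approximates a realistic failure surface beneath the scar.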

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Gamma Knife surgery (GKS) is a non-invasive neurosurgical stereotactic procedure, increasingly used as an alternative to open functional procedures. This includes targeting of the ventro-intermediate nucleus of the thalamus (Vim) for tremor. We currently perform indirect targeting, as the Vim is not visible on current 3 Tesla MRI acquisitions. Our objective was to enhance anatomic imaging (aiming at refining the precision of anatomic target selection by direct visualisation) in patients treated for tremor with Vim GKS, by using high-field 7T MRI. MATERIALS AND METHODS: Five young healthy subjects were scanned on 3T (T1-weighted and diffusion tensor imaging) and 7T (high-resolution susceptibility-weighted imaging (SWI)) MRI in Lausanne. All images were, for the first time, integrated into the Gamma Plan software (Elekta Instruments AB, Sweden) and co-registered (with T1 as the reference). A simulation of targeting of the Vim was done using various methods on the 3T images, and the position of the resulting target was correlated with the 7T SWI. The atlas of Morel et al. (Zurich, CH) was used to confirm the findings in a detailed analysis inside and outside the Gamma Plan. RESULTS: The use of SWI provided a superior resolution and an improved image contrast within the basal ganglia. This allowed visualization and direct delineation of some subgroups of thalamic nuclei in vivo, including the Vim. The position of the target, as assessed on 3T, matched perfectly the supposed position of the Vim on the SWI. Furthermore, a 3-dimensional model of the Vim target area was created on the basis of the obtained images. CONCLUSION: This is the first report of the integration of high-field SWI MRI into the LGP, aiming at improving targeting validation of the Vim in tremor. The anatomical correlation between the direct visualization on 7T and the current targeting methods on 3T (e.g. quadrilatere of Guyot, histological atlases) shows very good anatomical matching. Further studies are needed to validate this technique, both to improve the accuracy of targeting of the Vim (and potentially other thalamic nuclei) and to perform clinical assessment.

Relevance:

30.00%

Publisher:

Abstract:

This study assesses gender differences in spatial and non-spatial relational learning and memory in adult humans behaving freely in a real-world, open-field environment. In Experiment 1, we tested the use of proximal landmarks as conditional cues allowing subjects to predict the location of rewards hidden in one of two sets of three distinct locations. Subjects were tested in two different conditions: (1) when local visual cues marked the potentially-rewarded locations, and (2) when no local visual cues marked the potentially-rewarded locations. We found that only 17 of 20 adults (8 males, 9 females) used the proximal landmarks to predict the locations of the rewards. Although females exhibited higher exploratory behavior at the beginning of testing, males and females discriminated the potentially-rewarded locations similarly when local visual cues were present. Interestingly, when the spatial and local information conflicted in predicting the reward locations, males considered both spatial and local information, whereas females ignored the spatial information. However, in the absence of local visual cues females discriminated the potentially-rewarded locations as well as males. In Experiment 2, subjects (9 males, 9 females) were tested with three asymmetrically-arranged rewarded locations, which were marked by local cues on alternate trials. Again, females discriminated the rewarded locations as well as males in the presence or absence of local cues. In sum, although particular aspects of task performance might differ between genders, we found no evidence that women have poorer allocentric spatial relational learning and memory abilities than men in a real-world, open-field environment.