950 results for Objective measure
Abstract:
This academic activity has been the origin of other works that are also located in this repository. The first is the dataset of information about the geometry of the Monastery recorded during the two years of fieldwork; then several bachelor theses and papers are listed:
Abstract:
Without knowledge of basic seafloor characteristics, the ability to address any number of critical marine and/or coastal management issues is diminished. For example, management and conservation of essential fish habitat (EFH), a requirement mandated by federally guided fishery management plans (FMPs), requires among other things a description of habitats for federally managed species. Although the attributes important to habitat are numerous, the ability to describe many of them efficiently and effectively, especially at the scales required, does not exist with the tools currently available. However, several characteristics of seafloor morphology are readily obtainable at multiple scales and can serve as useful descriptors of habitat. Recent advancements in acoustic technology, such as multibeam echosounding (MBES), can provide remote indication of surficial sediment properties such as texture, hardness, or roughness, and further permit highly detailed renderings of seafloor morphology. With acoustic-based surveys providing a relatively efficient method for data acquisition, there exists a need for efficient and reproducible automated segmentation routines to process the data. Using MBES data collected by the Olympic Coast National Marine Sanctuary (OCNMS), and through a contracted seafloor survey, we expanded on the techniques of Cutter et al. (2003) to describe an objective, repeatable process that uses parameterized local Fourier histogram (LFH) texture features to automate segmentation of surficial sediments from acoustic imagery using a maximum likelihood decision rule. Sonar signatures and classification performance were evaluated using video imagery obtained from a towed camera sled. Segmented raster images were converted to polygon features and attributed using a hierarchical deep-water marine benthic classification scheme (Greene et al. 1999) for use in a geographical information system (GIS). (PDF contains 41 pages.)
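The maximum likelihood decision rule mentioned in this abstract can be illustrated with a toy Gaussian classifier. This is a sketch, not the survey's actual pipeline: the two "sediment classes", their feature distributions, and all dimensions below are invented stand-ins for the parameterized LFH texture features described above.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for per-pixel texture feature vectors of two classes.
n, d = 200, 4
sand = rng.normal(0.0, 1.0, (n, d))
gravel = rng.normal(3.0, 1.0, (n, d))

def fit_class(X):
    # Per-class Gaussian model: mean, inverse covariance, log-determinant.
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    return mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]

classes = [fit_class(sand), fit_class(gravel)]

def classify(x):
    # Maximum likelihood decision rule: assign x to the class with the
    # highest Gaussian log-likelihood.
    scores = [-0.5 * (ld + (x - mu) @ icov @ (x - mu)) for mu, icov, ld in classes]
    return int(np.argmax(scores))

labels = np.array([classify(x) for x in np.vstack([sand, gravel])])
accuracy = np.mean(labels == np.repeat([0, 1], n))
```

In the survey's setting the same rule would run over LFH feature vectors for each image region rather than over synthetic draws.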
Abstract:
In this work we clarify the relationships between riskiness, risk acceptance and bankruptcy avoidance. We distinguish between the restriction on the current wealth required to make a gamble acceptable to the decision maker and the restriction on the current wealth required to guarantee no bankruptcy if a gamble is accepted. We focus on the measure of riskiness proposed by Foster and Hart.
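The Foster-Hart riskiness R(g) of a gamble g is the unique R above the maximal loss solving E[log(1 + g/R)] = 0. A minimal numerical sketch for an illustrative discrete gamble (win 105 or lose 100, each with probability 1/2, for which R works out to exactly 2100):

```python
import math

# Illustrative gamble: (outcome, probability) pairs.
outcomes = [(105.0, 0.5), (-100.0, 0.5)]
max_loss = -min(x for x, _ in outcomes)

def expected_log(R):
    # E[log(1 + g/R)], defined for R above the maximal loss.
    return sum(p * math.log(1.0 + x / R) for x, p in outcomes)

# expected_log is negative just above the maximal loss and positive for
# large R, so bisection pins down the unique root.
lo, hi = max_loss * (1 + 1e-9), 1e9
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if expected_log(mid) < 0.0:
        lo = mid
    else:
        hi = mid
riskiness = 0.5 * (lo + hi)
# Here (1 + 105/R)(1 - 100/R) = 1 gives R = 2100 exactly.
```

Accepting this gamble at wealth below R(g) is what the paper's riskiness/bankruptcy-avoidance analysis is concerned with.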
Abstract:
This paper presents a model-based approach for reconstructing 3D polyhedral building models from aerial images. The proposed approach exploits some geometric and photometric properties resulting from the perspective projection of planar structures. Data are provided by calibrated aerial images. The novelty of the approach lies in its feature-free formulation and in its use of direct optimization based on raw image brightness. The proposed framework avoids feature extraction and matching. The 3D polyhedral model is directly estimated by optimizing an objective function that combines an image-based dissimilarity measure and a gradient score over several aerial images. The optimization process is carried out by the Differential Evolution algorithm. The proposed approach is intended to provide more accurate 3D reconstruction than feature-based approaches. Fast 3D model rectification and updating can take advantage of the proposed method. Several results and evaluations of performance from real and synthetic images show the feasibility and robustness of the proposed approach.
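Differential Evolution itself is straightforward to sketch. The toy objective below is a simple quadratic stand-in, not the paper's combined dissimilarity-plus-gradient score, and the control parameters are illustrative assumptions (DE/rand/1/bin variant):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective standing in for the image-based score over candidate
# model parameters; its known minimizer lets us check convergence.
target = np.array([1.5, -2.0, 0.5])
def objective(x):
    return float(np.sum((x - target) ** 2))

# DE/rand/1/bin with illustrative control parameters.
NP, D, F, CR, GENS = 30, 3, 0.8, 0.9, 300
pop = rng.uniform(-5.0, 5.0, (NP, D))
fit = np.array([objective(p) for p in pop])
for _ in range(GENS):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], size=3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)              # differential mutation
        cross = rng.random(D) < CR            # binomial crossover mask
        cross[rng.integers(D)] = True         # ensure at least one mutated gene
        trial = np.where(cross, mutant, pop[i])
        f_trial = objective(trial)
        if f_trial <= fit[i]:                 # greedy selection
            pop[i], fit[i] = trial, f_trial
best = pop[np.argmin(fit)]
```

Because DE only requires objective evaluations, it suits the paper's featureless setting, where the score is computed directly from image brightness with no gradients of the model parameters available.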
Abstract:
Survival from out-of-hospital cardiac arrest depends largely on two factors: early cardiopulmonary resuscitation (CPR) and early defibrillation. CPR must be interrupted for a reliable automated rhythm analysis because chest compressions induce artifacts in the ECG. Unfortunately, interrupting CPR adversely affects survival. In the last twenty years, research has been focused on designing methods for analysis of ECG during chest compressions. Most approaches are based either on adaptive filters to remove the CPR artifact or on robust algorithms which directly diagnose the corrupted ECG. In general, all the methods report low specificity values when tested on short ECG segments, but how to evaluate the real impact on CPR delivery of continuous rhythm analysis during CPR is still unknown. Recently, researchers have proposed a new methodology to measure this impact. Moreover, new strategies for fast rhythm analysis during ventilation pauses or high-specificity algorithms have been reported. Our objective is to present a thorough review of the field as the starting point for these latest developments and to underline the open questions and future lines of research to be explored in the coming years.
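A minimal sketch of the adaptive-filter family of approaches mentioned above, assuming a reference channel correlated with the compression artifact (e.g. compression depth) is available. All signals and filter parameters below are synthetic stand-ins, not clinical data:

```python
import numpy as np

fs = 250.0
t = np.arange(0, 8, 1 / fs)

# Synthetic stand-ins: a clean "ECG" component and a compression artifact
# driven by a reference at ~2 Hz (about 120 compressions per minute).
ecg = 0.2 * np.sin(2 * np.pi * 1.1 * t)
reference = np.sin(2 * np.pi * 2.0 * t)
corrupted = ecg + reference

# LMS adaptive filter: predict the artifact from the reference channel and
# subtract it, leaving the underlying rhythm for analysis.
M, mu = 8, 0.01
w = np.zeros(M)
cleaned = np.copy(corrupted)
for i in range(M, len(t)):
    x = reference[i - M + 1:i + 1][::-1]   # most recent reference samples
    e = corrupted[i] - w @ x               # artifact-cancelled output
    w += 2 * mu * e * x                    # LMS weight update
    cleaned[i] = e

half = len(t) // 2
mse_before = np.mean((corrupted[half:] - ecg[half:]) ** 2)
mse_after = np.mean((cleaned[half:] - ecg[half:]) ** 2)
```

Real methods in this literature are considerably more elaborate (time-varying compression rates, multiple reference channels), but the cancel-then-diagnose structure is the same.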
Abstract:
Although many optical fibre applications are based on their capacity to transmit optical signals with low losses, it can also be desirable for the optical fibre to be strongly affected by a certain physical parameter in the environment. In this way, it can be used as a sensor for this parameter. There are many strong arguments for the use of plastic optical fibres (POFs) as sensors. In addition to being easy to handle and low cost, they demonstrate advantages common to all multimode optical fibres. These specifically include flexibility, small size, good electromagnetic compatibility behaviour, and in general, the possibility of measuring any phenomenon without physically interacting with it. In this paper, a sensor based on POF is designed and analysed with the aim of measuring the volume and turbidity of a low viscosity fluid, in this case water, as it passes through a pipe. A comparative study with a commercial sensor is provided to validate the proposed flow measurement. Likewise, turbidity is measured using different colour dyes. Finally, this paper presents the most significant results and conclusions from all the tests carried out.
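One common way such an intensity-based turbidity reading is turned into a number is a Beer-Lambert style attenuation estimate. The paper does not specify its calibration; the readings and path length below are hypothetical:

```python
import math

# Beer-Lambert style attenuation: I = I0 * exp(-alpha * L), so alpha follows
# from a clear-water reference reading and the turbid-sample reading.
I0 = 1.00   # received intensity with clear water (arbitrary units, assumed)
I = 0.62    # received intensity with a dyed sample (hypothetical reading)
L = 0.04    # optical path length across the pipe in metres (assumed)

alpha = -math.log(I / I0) / L   # attenuation coefficient, 1/m
# Mapping alpha to turbidity units (e.g. NTU) would require calibration
# against a reference instrument, as in the paper's comparative study.
```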
Abstract:
128 p. Withdrawn at the author's request, 03/03/2016
Abstract:
The purpose of this essay is to clarify the theoretical understanding of the concept of resilience in order to explore problems surrounding the empirical measurement and application of the concept, as well as to examine strategic examples of empirical measures and policy applications in the literature of several disciplines, fields, and professions. The examination of resilience occurs in two streams: one conceptual and one methodological. At the conceptual level, the focus will be on definitions, distinctions between resilience and related concepts, and the theoretical frameworks that underlie usage of the concept. At the empirical level, the examination of resilience will be centered on the methodological challenges associated with research on resilience as well as previous attempts to operationalize and measure resilience. (PDF contains 4 pages)
Abstract:
Cdc48/p97 is an essential, highly abundant hexameric member of the AAA (ATPase associated with various cellular activities) family. It has been linked to a variety of processes throughout the cell but it is best known for its role in the ubiquitin proteasome pathway. In this system it is believed that Cdc48 behaves as a segregase, transducing the chemical energy of ATP hydrolysis into mechanical force to separate ubiquitin-conjugated proteins from their tightly-bound partners.
Current models posit that Cdc48 is linked to its substrates through a variety of adaptor proteins, including a family of seven proteins (13 in humans) that contain a Cdc48-binding UBX domain. As such, due to the complexity of the network of adaptor proteins for which it serves as the hub, Cdc48/p97 has the potential to exert a profound influence on the ubiquitin proteasome pathway. However, the number of known substrates of Cdc48/p97 remains relatively small, and smaller still is the number of substrates that have been linked to a specific UBX domain protein. Accordingly, the goal of this dissertation research has been to discover new substrates and better understand the functions of the Cdc48 network. With this objective in mind, we established a proteomic screen to assemble a catalog of candidate substrates/targets of the Ubx adaptor system.
Here we describe the implementation and optimization of a cutting-edge quantitative mass spectrometry method to measure relative changes in the Saccharomyces cerevisiae proteome. Utilizing this technology, and in order to better understand the breadth of function of Cdc48 and its adaptors, we then performed a global screen to identify accumulating ubiquitin conjugates in cdc48-3 and ubxΔ mutants. In this screen different ubx mutants exhibited reproducible patterns of conjugate accumulation that differed greatly from each other, pointing to various unexpected functional specializations of the individual Ubx proteins.
As validation of our mass spectrometry findings, we then examined in detail the endoplasmic-reticulum bound transcription factor Spt23, which we identified as a putative Ubx2 substrate. In these studies ubx2Δ cells were deficient in processing of Spt23 to its active p90 form, and in localizing p90 to the nucleus. Additionally, consistent with reduced processing of Spt23, ubx2Δ cells demonstrated a defect in expression of the Spt23 target gene OLE1, a fatty acid desaturase. Overall, this work demonstrates the power of proteomics as a tool to identify new targets of various pathways and reveals Ubx2 as a key regulator of lipid membrane biosynthesis.
Abstract:
In this thesis, we develop an efficient collapse prediction model, the PFA (Peak Filtered Acceleration) model, for buildings subjected to different types of ground motions.
For the structural system, the PFA model covers modern steel and reinforced concrete moment-resisting frame buildings (potentially reinforced concrete shear wall buildings). For ground motions, the PFA model covers ramp-pulse-like ground motions, long-period ground motions, and short-period ground motions.
To predict whether a building will collapse in response to a given ground motion, we first extract long-period components from the ground motion using a Butterworth low-pass filter with suggested order and cutoff frequency. The order depends on the type of ground motion, and the cutoff frequency depends on the building’s natural frequency and ductility. We then compare the filtered acceleration time history with the capacity of the building. The capacity of the building is a constant for two-dimensional buildings and a limit domain for three-dimensional buildings. If the filtered acceleration exceeds the building’s capacity, the building is predicted to collapse. Otherwise, it is expected to survive the ground motion.
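The filter-and-compare step described above can be sketched with a standard Butterworth low-pass (here via SciPy). The filter order, cutoff, ground-motion record, and capacity below are illustrative placeholders, not the thesis's calibrated values, which depend on ground-motion type and on the building's natural frequency and ductility:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                         # sampling rate of the record, Hz
t = np.arange(0, 20, 1 / fs)

# Hypothetical ground acceleration (in g): a long-period 0.25 Hz pulse plus
# short-period 12 Hz content that should not drive collapse.
accel = 0.5 * np.sin(2 * np.pi * 0.25 * t) + 0.2 * np.sin(2 * np.pi * 12.0 * t)

# Low-pass Butterworth filter; order and cutoff are assumed values here.
order, cutoff_hz = 4, 1.0
b, a = butter(order, cutoff_hz, btype="low", fs=fs)
filtered = filtfilt(b, a, accel)   # zero-phase filtering

pfa = np.max(np.abs(filtered))     # Peak Filtered Acceleration
capacity = 0.3                     # assumed lateral capacity (g), 2-D case
predicted_collapse = pfa > capacity
```

With these placeholder numbers the short-period content is stripped out, the long-period pulse survives the filter, and the record is flagged as collapse-inducing because its PFA exceeds the assumed capacity.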
The parameters used in the PFA model, which include fundamental period, global ductility, and lateral capacity, can be obtained either from numerical analysis or by interpolation based on the reference building system proposed in this thesis.
The PFA collapse prediction model greatly reduces computational complexity while achieving good accuracy. It is verified by FEM simulations of 13 frame building models and 150 ground motion records.
Based on the developed collapse prediction model, we propose to use PFA (Peak Filtered Acceleration) as a new ground motion intensity measure for collapse prediction. We compare PFA with traditional intensity measures PGA, PGV, PGD, and Sa in collapse prediction and find that PFA has the best performance among all the intensity measures.
We also provide a closed form, in terms of a vector intensity measure (PGV, PGD), of the PFA collapse prediction model for practical collapse risk assessment.