885 results for Competency-Based Approach
Abstract:
Formation of hydrates is one of the major flow assurance problems faced by the oil and gas industry. Hydrates tend to form in natural gas pipelines in the presence of water under favorable temperature and pressure conditions, generally low temperatures and correspondingly high pressures. Agglomeration of hydrates can block flowlines and equipment; such blockages are time-consuming to remove in subsea equipment and raise safety issues. Natural gas pipelines are more susceptible to bursting and explosion owing to hydrate plugging. A rigorous risk assessment of hydrate formation is therefore required, which assists in preventing hydrate blockage and ensuring equipment integrity. This thesis presents a novel methodology to assess the probability of hydrate formation and a risk-based approach to determine the parameters of winterization schemes that avoid hydrate formation in natural gas pipelines operating in Arctic conditions. It also presents a lab-scale multiphase flow loop for studying the effects of geometric and hydrodynamic parameters on hydrate formation, and discusses their effects on the multiphase development length of a pipeline. This study thus contributes substantially to assessing the probability of hydrate formation and to the decision-making process for winterization strategies that prevent hydrate formation in Arctic conditions.
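To make the probability assessment concrete, here is a minimal Monte Carlo sketch, not the thesis methodology: hydrate_equilibrium_temp is a hypothetical placeholder curve, and the normal distributions for operating conditions are assumed purely for illustration; a real assessment would use a thermodynamic hydrate model and measured condition data.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def hydrate_equilibrium_temp(pressure_mpa):
    """Placeholder hydrate equilibrium curve: returns the temperature (K)
    below which hydrates are stable at a given pressure. A real study
    would use a thermodynamic model or measured phase-equilibrium data."""
    return 270.0 + 5.0 * np.log(pressure_mpa)  # illustrative shape only

def hydrate_formation_probability(t_mean, t_std, p_mean, p_std, n=100_000):
    """Monte Carlo estimate of P(T < T_eq(P)) under assumed normal
    distributions for operating temperature and pressure."""
    temps = rng.normal(t_mean, t_std, n)       # operating temperature, K
    pressures = rng.normal(p_mean, p_std, n)   # operating pressure, MPa
    pressures = np.clip(pressures, 0.1, None)  # keep pressures physical
    inside_stability_region = temps < hydrate_equilibrium_temp(pressures)
    return inside_stability_region.mean()

# Example: cold, high-pressure operating conditions (assumed numbers)
print(hydrate_formation_probability(t_mean=280.0, t_std=3.0,
                                    p_mean=10.0, p_std=1.5))
```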
Abstract:
A mosaic of two WorldView-2 high-resolution multispectral images (acquisition dates: October 2010 and April 2012), in conjunction with field survey data, was used to create a habitat map of the Danajon Bank, Philippines (10°15'0'' N, 124°08'0'' E) using an object-based approach. To create the habitat map, we conducted benthic cover (seafloor) field surveys using two methods. First, we undertook georeferenced point intercept transects (English et al., 1997): at ten sites we recorded habitat cover types at 1 m intervals along 10 m transects (n = 2,070 points). Second, we conducted georeferenced spot-check surveys, placing a viewing bucket in the water to estimate the percent cover of benthic cover types (n = 2,357 points). Survey locations were chosen to cover a diverse and representative subset of the habitats found in the Danajon Bank. The combination of methods was a compromise between the higher accuracy of point intercept transects and the larger sample area achievable through spot-check surveys (Roelfsema and Phinn, 2008, doi:10.1117/12.804806). Object-based image analysis, using the field data for calibration, was used to classify the image mosaic at each of the reef, geomorphic and benthic community levels. The benthic community level segregated the image into a total of 17 pure and mixed benthic classes.
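As an illustration of the object-based workflow, the sketch below segments a multispectral image into objects and labels each object using a classifier trained on georeferenced field points. It is a minimal stand-in, not the study's mapping procedure: SLIC segmentation, mean-band features and a random forest are assumed substitutes for the actual segmentation and classification rule set.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def classify_objects(image, train_rows, train_cols, train_labels):
    """Segment a multispectral image (H x W x bands) into objects, then
    label every object with a classifier calibrated on field points."""
    # Object-based step: group pixels into spectrally homogeneous segments.
    segments = slic(image, n_segments=2000, compactness=10.0,
                    channel_axis=-1, start_label=0)
    n_segments = segments.max() + 1

    # Per-object features: mean reflectance of each band within a segment.
    n_bands = image.shape[-1]
    feats = np.zeros((n_segments, n_bands))
    for b in range(n_bands):
        feats[:, b] = np.bincount(segments.ravel(),
                                  weights=image[..., b].ravel(),
                                  minlength=n_segments)
    feats /= np.bincount(segments.ravel(), minlength=n_segments)[:, None]

    # Calibration: each field point labels the object it falls inside.
    train_segs = segments[train_rows, train_cols]
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(feats[train_segs], train_labels)

    # Predict a class for every object and map back to the pixel grid.
    return clf.predict(feats)[segments]
```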
Abstract:
The absence of a rapid, low-cost and highly sensitive biodetection platform has hindered the implementation of the next generation of cheap, early-stage clinical or home-based point-of-care diagnostics. Label-free optical biosensing with high sensitivity, throughput, compactness, and low cost plays an important role in resolving these diagnostic challenges and pushes the detection limit down to the single molecule. Optical nanostructures, specifically resonant waveguide gratings (RWGs) and nano-ribbon cavity based biodetection, are promising in this context. The main element of this dissertation is the design, fabrication and characterization of RWG sensors for different spectral regions (e.g. visible, near infrared) for use in label-free optical biosensing, and the exploration of different RWG parameters to maximize sensitivity and increase detection accuracy. The design and fabrication of a waveguide-embedded resonant nano-cavity are also studied. Multi-parametric analyses were performed with a customized optical simulator to understand the operating principle of these sensors and, more importantly, the relationship between the physical design parameters and sensor sensitivities. Silicon nitride (SixNy) is a useful waveguide material because of its wide transparency across the whole infrared, visible and part of the UV spectrum, and its higher refractive index relative to the glass substrate. SixNy-based RWGs on glass substrates are designed and fabricated using both electron beam lithography and low-cost nano-imprint lithography. A chromium hard-mask-aided nano-fabrication technique is developed for making very high aspect ratio optical nano-structures on glass substrates; an aspect ratio of 10 for very narrow (~60 nm wide) grating lines is achieved, the highest reported so far. The fabricated RWG sensors are characterized for both bulk (183.3 nm/RIU) and surface sensitivity (0.21 nm/nm-layer), and then used for the successful detection of Immunoglobulin G (IgG) antibodies and antigen (~1 μg/ml) both in buffer and in serum. Widely used optical biosensors such as surface plasmon resonance and optical microcavities are limited in separating the bulk response from surface binding events, which is crucial for ultralow-level biosensing under thermal or other perturbations. An RWG-based dual-resonance approach is proposed and verified by controlled experiments for separating the bulk and surface responses. The dual-resonance approach gives a sensitivity ratio of 9.4, whereas the competing polarization-based approach offers only 2.5. The improved performance of the dual-resonance approach would help reduce the probability of false readings in precise bio-assay experiments where thermal variations are likely, such as portable diagnostics.
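The dual-resonance separation reduces to inverting a 2 × 2 linear system: each resonance shift is a linear combination of the bulk index change and the adlayer growth. In the sketch below, the first row of the sensitivity matrix reuses the bulk and surface sensitivities quoted above, while the second row is an assumed second resonance added only to illustrate the principle.

```python
import numpy as np

# Sensitivity matrix: rows = the two resonances, columns = (bulk, surface).
# Row 1 uses the reported bulk (183.3 nm/RIU) and surface (0.21 nm/nm-layer)
# sensitivities; row 2 is a hypothetical second resonance with a different
# bulk/surface sensitivity ratio.
S = np.array([[183.3, 0.21],
              [ 60.0, 0.15]])   # second row assumed for illustration

def separate_bulk_surface(shift_res1_nm, shift_res2_nm):
    """Recover (bulk index change in RIU, adlayer growth in nm) from the
    measured shifts of the two resonances by inverting the 2x2 system
      dlambda_i = S_i_bulk * dn_bulk + S_i_surf * dt_layer."""
    shifts = np.array([shift_res1_nm, shift_res2_nm])
    dn_bulk, dt_layer = np.linalg.solve(S, shifts)
    return dn_bulk, dt_layer

# Example: shifts produced by dn_bulk = 1e-3 RIU plus a 2 nm adlayer
true = np.array([1e-3, 2.0])
print(separate_bulk_surface(*(S @ true)))   # recovers ~ (1e-3, 2.0)
```

The decoupling works only if the two resonances have clearly different bulk-to-surface sensitivity ratios; otherwise the system is near-singular and noise is amplified.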
Telescoped approach to aryl hydroxymethylation in the synthesis of a key pharmaceutical intermediate
Abstract:
An efficient synthetic approach for introducing the hydroxymethyl group onto an aryl moiety via a combination of the Bouveault formylation and hydride reduction has been optimized using a rational, mechanism-based approach. This approach enabled telescoping of the two steps into a single efficient process, readily amenable to scale-up.
Abstract:
This paper introduces a normative view on corporate reputation strategic management. Reputation performance is conceptualised as the outcome of complex processes and social interactions, and the lack of a holistic reputation performance management framework is identified. In an attempt to fill this gap, a portfolio-based approach is put forward. Drawing on the foundations of modern portfolio theory, we create a portfolio-based reputation management algorithmic model in which reputation components and priorities are weighted by decision makers and shape organisational change in an attempt to formulate a corporate reputation strategy. The rationale of this paper rests on the foundational consideration of organisations as choosing the optimal strategy by seeking to maximise their reputation performance while maintaining organisational stability and minimising organisational risk.
Abstract:
This paper introduces a normative view on corporate reputation management; an algorithmic model for reputation-driven strategic decision making is proposed and corporate reputation is conceptualized as influenced by a selection among organizational priorities. A portfolio-based approach is put forward; we draw on the foundations of portfolio theory and we create a portfolio-based reputation management model where reputation components and priorities are weighted by decision makers and shape organizational change in an attempt to formulate a corporate reputation strategy. The rationale of this paper is based on the foundational consideration of organizations as choosing the optimal strategy by seeking to maximize performance on corporate reputation capital while maintaining organizational stability and minimizing organizational risk.
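A minimal sketch of the portfolio idea under assumed inputs: the expected reputation performance and risk (covariance) of each component are hypothetical numbers, and the optimizer chooses component weights that trade performance against risk, Markowitz-style. This is an illustration of the style of model, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical reputation components and decision-maker inputs.
expected_perf = np.array([0.7, 0.5, 0.9, 0.4])  # expected reputation return
cov = np.diag([0.04, 0.02, 0.09, 0.01])         # assumed component risk (variance)
risk_aversion = 3.0                             # organisational risk tolerance

def objective(w):
    """Negative of (performance - risk penalty), Markowitz-style."""
    return -(w @ expected_perf - risk_aversion * w @ cov @ w)

n = len(expected_perf)
result = minimize(
    objective,
    x0=np.full(n, 1.0 / n),                     # start from equal priorities
    bounds=[(0.0, 1.0)] * n,                    # no negative priority weights
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
print("optimal component weights:", result.x.round(3))
```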
Abstract:
We propose a novel skeleton-based approach to gait recognition using our Skeleton Variance Image. The core of our approach consists of employing the screened Poisson equation to construct a family of smooth distance functions associated with a given shape. The screened Poisson distance function approximation nicely absorbs, and is relatively stable to, shape boundary perturbations, which allows us to define a rough shape skeleton. We demonstrate that our Skeleton Variance Image is a powerful gait cycle descriptor, leading to a significant improvement over the existing state-of-the-art gait recognition rate.
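One common way to realize a screened-Poisson smooth distance function is sketched below; this is an illustration under assumptions, not necessarily the paper's exact formulation. It solves (I - tΔ)u = b with unit sources b on the shape boundary, then applies a Varadhan-style log transform; a rough skeleton can be traced along ridges of the resulting field.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def screened_poisson_distance(boundary_mask, t=1.0):
    """Smooth distance-like function from a binary boundary mask by
    solving (I - t*Laplacian) u = b on the pixel grid, then applying
    a Varadhan-style transform d ~ -sqrt(t) * log(u)."""
    h, w = boundary_mask.shape
    n = h * w

    # 2D 5-point Laplacian assembled via Kronecker sums.
    def lap1d(m):
        return sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(m, m))
    L = sp.kron(sp.eye(h), lap1d(w)) + sp.kron(lap1d(h), sp.eye(w))

    b = boundary_mask.astype(float).ravel()    # unit sources on the boundary
    u = spsolve((sp.eye(n) - t * L).tocsc(), b)
    u = np.maximum(u / u.max(), 1e-300)        # normalise, avoid log(0)
    return (-np.sqrt(t) * np.log(u)).reshape(h, w)

# Example: distance-like field from a small square contour
mask = np.zeros((64, 64), dtype=bool)
mask[20, 20:45] = mask[44, 20:45] = mask[20:45, 20] = mask[20:45, 44] = True
d = screened_poisson_distance(mask, t=4.0)
print(d.min(), d.max())
```

Larger t gives a smoother field that is less sensitive to boundary perturbations, which is the stability property the approach exploits.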
Abstract:
This review discusses the potential application of bacterial viruses (phage therapy) to the eradication of antibiotic-resistant Pseudomonas aeruginosa in children with cystic fibrosis (CF). In this regard, several potential relationships between bacteria and their bacteriophages are considered. The most important aspect of phage therapy for bacterial infections in the lungs of CF patients is ensuring continuity of treatment in light of the continual emergence of resistant bacteria. This depends on the ability to rapidly select phages exhibiting an enhanced spectrum of lytic activity from several well-studied phage groups of proven safety. We propose a modular approach utilizing both mono-species and hetero-species phage mixtures. An approach based on the visual recognition of characteristics exhibited by phages of well-studied groups on lawns of the standard P. aeruginosa PAO1 strain permits simple and rapid enhancement of the lytic spectrum of cocktails, allowing the development of tailored preparations for patients that can circumvent problems associated with phage-resistant bacterial mutants.
Abstract:
The growth of networks, blogs, and users of social review sites has made the Internet an enormous source of data, in particular on how people think, feel, and act towards different issues. These days, people's opinions play an important role in politics, industry, education, etc. Governments, large and small industries, academic institutes, companies, and individuals therefore seek automatic techniques for extracting the information they need from large volumes of data. Sentiment analysis is a direct answer to this need. It is an application of natural language processing and computational linguistics that draws on state-of-the-art techniques such as machine learning and language models to capture positive, negative, or neutral evaluations, with or without their strength, in plain text. In this thesis, we study a case-based approach to document-level sentiment analysis. Our case-based approach generates a binary classifier that uses a set of classified documents and five different sentiment lexicons to extract the polarity from the scores assigned to reviews. Since sentiment analysis is inherently a domain-dependent task, which makes the work difficult and costly, we apply a cross-domain approach by basing our classifier on six different domains instead of limiting it to a single domain. To improve classification accuracy, we add negation detection as part of our algorithm. Furthermore, to improve the performance of our approach, some innovative modifications are applied. It is worth mentioning that our approach opens the way to new developments by adding more sentiment lexicons and datasets in the future.
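A minimal sketch of the lexicon-plus-negation scoring step, with toy word lists standing in for the five published sentiment lexicons used in the thesis:

```python
# Hypothetical mini-lexicons; the thesis combines five published
# sentiment lexicons, substituted here by toy word scores.
LEXICONS = [
    {"good": 1.0, "great": 2.0, "bad": -1.0, "awful": -2.0},
    {"excellent": 2.0, "poor": -1.5, "boring": -1.0},
]
NEGATORS = {"not", "no", "never", "n't"}

def polarity(document, window=3):
    """Sum lexicon scores over tokens, flipping the sign of any scored
    word that appears within `window` tokens after a negator."""
    tokens = document.lower().split()
    score = 0.0
    for i, tok in enumerate(tokens):
        word_score = sum(lex.get(tok, 0.0) for lex in LEXICONS)
        negated = any(t in NEGATORS for t in tokens[max(0, i - window):i])
        score += -word_score if negated else word_score
    return "positive" if score >= 0 else "negative"

print(polarity("the plot was not good but the acting was excellent"))
```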
Abstract:
Software protection is an essential aspect of information security: it withstands malicious activities on software and preserves software assets. However, software developers still lack a methodology for assessing the protections they deploy. To address this, we present a novel attack-simulation-based software protection assessment method for assessing and comparing protection solutions. Our solution relies on Petri nets to specify and visualize attack models, and we developed a Monte Carlo based approach to simulate attack processes and to deal with uncertainty. Based on this simulation and estimation, a novel protection comparison model is proposed to compare different protection solutions. Lastly, our attack-simulation-based software protection assessment method is presented. We illustrate the method by means of a software protection assessment process, demonstrating that our approach can provide a suitable software protection assessment for developers and software companies.
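A minimal sketch of the Monte Carlo step, with an assumed linear attack path in place of a Petri net model: each step has a hypothetical success probability and effort distribution, and repeated simulation estimates the attack success rate and the mean effort of successful attacks under a given protection.

```python
import random

# Hypothetical attack model: each step has an (assumed) success
# probability and mean effort in hours once a protection is applied.
# A real model would be derived from the Petri net attack specification.
ATTACK_PATH = [
    {"step": "locate license check", "p_success": 0.9, "mean_hours": 4.0},
    {"step": "defeat obfuscation",   "p_success": 0.5, "mean_hours": 30.0},
    {"step": "patch binary",         "p_success": 0.8, "mean_hours": 6.0},
]

def simulate(n_runs=100_000, seed=1):
    """Monte Carlo over the attack path: a run succeeds only if every
    step succeeds; report success rate and mean effort of successes."""
    rng = random.Random(seed)
    successes, total_effort = 0, 0.0
    for _ in range(n_runs):
        effort = 0.0
        for step in ATTACK_PATH:
            if rng.random() > step["p_success"]:
                break                               # attack abandoned
            effort += rng.expovariate(1.0 / step["mean_hours"])
        else:
            successes += 1
            total_effort += effort
    return successes / n_runs, total_effort / max(successes, 1)

rate, mean_effort = simulate()
print(f"attack success rate {rate:.3f}, mean effort {mean_effort:.1f} h")
```

Comparing these estimates across protection solutions (each changing the per-step probabilities and efforts) is the basis of the comparison model.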
Abstract:
In order to optimize frontal detection in sea surface temperature fields at 4 km resolution, a combined statistical and expert-based approach is applied to test different spatial smoothings of the data prior to the detection process. Fronts are usually detected at 1 km resolution using the histogram-based, single image edge detection (SIED) algorithm developed by Cayula and Cornillon in 1992, with standard preliminary smoothing using a median filter and a 3 × 3 pixel kernel. Here, detections are performed in three study regions (off Morocco, the Mozambique Channel, and north-western Australia) and across the Indian Ocean basin using the combination of multiple windows (CMW) method developed by Nieto, Demarcq and McClatchie in 2012, which improves on the original Cayula and Cornillon algorithm. Detections at 4 km and 1 km resolution are compared. Fronts are divided into two intensity classes ("weak" and "strong") according to their thermal gradient. A preliminary smoothing is applied prior to detection using different convolutions: three types of filters (median, average and Gaussian) combined with four kernel sizes (3 × 3, 5 × 5, 7 × 7, and 9 × 9 pixels) and three detection window sizes (16 × 16, 24 × 24 and 32 × 32 pixels), to test the effect of these smoothing combinations on reducing the background noise of the data and thereby improving frontal detection. The performance of the combinations on 4 km data is evaluated using two criteria: detection efficiency and front length. We find that the optimal combination of preliminary smoothing parameters for enhancing detection efficiency while preserving front length comprises a median filter, a 16 × 16 pixel window size, and a 5 × 5 pixel kernel for strong fronts or a 7 × 7 pixel kernel for weak fronts. Results show an improvement in detection performance (from the largest to the smallest window size) of 71% for strong fronts and 120% for weak fronts. Despite the small window used (16 × 16 pixels), front length is preserved relative to that found with 1 km data. This optimal preliminary smoothing and the CMW detection algorithm on 4 km sea surface temperature data are then used to describe the spatial distribution of the monthly frequencies of occurrence of both strong and weak fronts across the Indian Ocean basin. In general, strong fronts are observed in coastal areas whereas weak fronts, with some seasonal exceptions, are mainly located in the open ocean. This study shows that adequate noise reduction through preliminary smoothing of the data considerably improves frontal detection efficiency as well as the overall quality of the results. Consequently, the use of 4 km data enables frontal detections similar to those from 1 km data (using a standard median 3 × 3 convolution) in terms of detectability, length and location. This method, using 4 km data, is easily applicable to large regions or at the global scale with far fewer constraints on data manipulation and processing time relative to 1 km data.
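A minimal illustration of the preliminary smoothing step (not the SIED or CMW algorithms themselves): median-filter the SST field with a chosen kernel, then flag weak and strong fronts by gradient magnitude. The thresholds here are purely illustrative.

```python
import numpy as np
from scipy import ndimage

def detect_fronts(sst, kernel=5, weak=0.1, strong=0.3):
    """Median-smooth an SST field with a kernel x kernel window, then
    split pixels into weak and strong fronts by gradient magnitude
    (degrees C per pixel). Thresholds are illustrative only."""
    smoothed = ndimage.median_filter(sst, size=kernel)
    gy, gx = np.gradient(smoothed)
    grad = np.hypot(gx, gy)
    return (grad >= weak) & (grad < strong), grad >= strong

# Synthetic field: warm and cool water masses meeting along a front
rng = np.random.default_rng(0)
sst = np.where(np.arange(256)[:, None] < 128, 22.0, 26.0) * np.ones((256, 256))
sst += rng.normal(0, 0.15, sst.shape)
weak_mask, strong_mask = detect_fronts(sst, kernel=5)
print(weak_mask.sum(), strong_mask.sum())
```

Varying `kernel` reproduces the trade-off studied above: larger kernels suppress more background noise but blur, and eventually erase, weak fronts.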
Abstract:
A model-driven approach to service evolution in clouds focuses mainly on the advantages of reusable evolution patterns for solving evolution problems. During the process, evolution patterns are mapped, driven by MDA models, to pattern aspects. Weaving these aspects into the service-based process at runtime, using an aspect-oriented extended BPEL engine, provides the dynamic feature of the evolution.
Abstract:
The emergence of multidrug-resistant bacterial infections in both the clinical setting and the community has created an environment in which the development of novel antibacterial compounds is necessary to keep dangerous infections at bay. While the derivatization of existing antibiotics by pharmaceutical companies has so far been successful at achieving this end, the strategy is short-term, and the discovery of antibacterials with novel scaffolds would be a greater contribution to the fight against multidrug-resistant infections. Described herein is the application of both target-based and whole-cell screening strategies to identify novel antibacterial compounds. In a target-based approach, we sought small-molecule disruptors of the MazEF toxin-antitoxin protein complex. The lack of a facile, continuous assay for this target required the development of a fluorometric assay for MazF ribonuclease activity. This assay was employed to further characterize the activity of the MazF enzyme and was used in a screening effort to identify disruptors of the MazEF complex. In addition, by employing a whole-cell screening approach, we identified two compounds with potent antibacterial activity. Efforts to characterize the in vitro antibacterial activities displayed by these compounds and to identify their modes of action are described.
Abstract:
Reliability and dependability modeling can be employed during many stages of analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially in proportion to the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition whereby precomputed subpaths are composed to compute the whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate many (unimportant) paths. Evaluating the important ones helps to compute tight bounds efficiently and quickly.
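Uniformization itself, which the approach extends, fits in a few lines: with Λ at least the maximum exit rate and P = I + Q/Λ, the transient distribution is a Poisson-weighted sum of DTMC powers. A minimal sketch follows (dense matrices and a naive truncation rule; production code would compute weights in log space and bound the truncation error):

```python
import numpy as np

def transient_probs(Q, p0, t, tol=1e-12):
    """Transient state probabilities of a CTMC by uniformization:
    p(t) = sum_k Poisson(k; Lambda*t) * p0 @ P^k, where
    P = I + Q/Lambda and Lambda >= the maximum exit rate."""
    Lam = max(-Q.diagonal())              # uniformization rate
    P = np.eye(Q.shape[0]) + Q / Lam      # embedded DTMC transition matrix
    weight = np.exp(-Lam * t)             # Poisson(0) term
    acc = weight * p0
    v, k = p0.copy(), 0
    while weight > tol or k < Lam * t:    # simple truncation rule
        k += 1
        v = v @ P                         # p0 @ P^k, built incrementally
        weight *= Lam * t / k             # next Poisson weight
        acc += weight * v
    return acc

# Tiny 3-state birth-death example
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -1.0]])
p0 = np.array([1.0, 0.0, 0.0])
print(transient_probs(Q, p0, t=0.5))      # entries sum to ~1
```

The path-based extension avoids ever forming the full vector-matrix products above by grouping reward-equivalent paths, which is what makes very large models tractable.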
Abstract:
Nurse rostering is a complex scheduling problem that affects hospital personnel on a daily basis all over the world. This paper presents a new component-based approach with adaptive perturbations, for a nurse scheduling problem arising at a major UK hospital. The main idea behind this technique is to decompose a schedule into its components (i.e. the allocated shift pattern of each nurse), and then mimic a natural evolutionary process on these components to iteratively deliver better schedules. The worthiness of all components in the schedule has to be continuously demonstrated in order for them to remain there. This demonstration employs a dynamic evaluation function which evaluates how well each component contributes towards the final objective. Two perturbation steps are then applied: the first perturbation eliminates a number of components that are deemed not worthy to stay in the current schedule; the second perturbation may also throw out, with a low probability, some worthy components. The eliminated components are replenished with new ones using a set of constructive heuristics based on local optimality criteria. Computational results using 52 data instances demonstrate the applicability of the proposed approach in solving real-world problems.
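A minimal sketch of the component-based loop with the two perturbations, using hypothetical evaluate/rebuild/worthiness hooks in place of the paper's dynamic evaluation function and constructive heuristics:

```python
import random

def improve(schedule, evaluate, rebuild, worthiness,
            iters=1000, kill_fraction=0.2, noise=0.05, seed=0):
    """Sketch of the component-based loop: score each component
    (e.g. one nurse's shift pattern), drop the least worthy fraction
    plus, with low probability, some worthy ones, then rebuild with
    constructive heuristics and keep the change if the objective improves."""
    rng = random.Random(seed)
    best, best_cost = schedule, evaluate(schedule)
    for _ in range(iters):
        scored = sorted(best, key=worthiness)     # least worthy first
        n_kill = int(kill_fraction * len(scored))
        survivors = [c for c in scored[n_kill:]   # first perturbation
                     if rng.random() > noise]     # second perturbation
        candidate = rebuild(survivors)            # constructive repair
        cost = evaluate(candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best

# Toy usage: components are integers, the objective prefers small values,
# and "rebuild" replenishes dropped components with random new ones.
rng = random.Random(1)
sched = [rng.randint(0, 9) for _ in range(10)]
result = improve(
    sched,
    evaluate=lambda s: sum(x * x for x in s) + abs(10 - len(s)) * 100,
    rebuild=lambda surv: surv + [rng.randint(0, 9)
                                 for _ in range(10 - len(surv))],
    worthiness=lambda c: -c,        # large values are least worthy here
)
print(result)
```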