854 results for Residual-Based Panel Cointegration Test
Abstract:
Locating hands in sign language video is challenging due to a number of factors. Hand appearance varies widely across signers due to anthropometric variations and varying levels of signer proficiency. Video can be captured under varying illumination, camera resolutions, and levels of scene clutter, e.g., high-res video captured in a studio vs. low-res video gathered by a web cam in a user’s home. Moreover, the signers’ clothing varies, e.g., skin-toned clothing vs. contrasting clothing, short-sleeved vs. long-sleeved shirts, etc. In this work, the hand detection problem is addressed in an appearance matching framework. The Histogram of Oriented Gradient (HOG) based matching score function is reformulated to allow non-rigid alignment between pairs of images to account for hand shape variation. The resulting alignment score is used within a Support Vector Machine hand/not-hand classifier for hand detection. The new matching score function yields improved performance (in ROC area and hand detection rate) over the Vocabulary Guided Pyramid Match Kernel (VGPMK) and the traditional, rigid HOG distance on American Sign Language video gestured by expert signers. The proposed match score function is computationally less expensive (for training and testing), has fewer parameters and is less sensitive to parameter settings than VGPMK. The proposed detector works well on test sequences from an inexpert signer in a non-studio setting with cluttered background.
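As a rough illustration of the baseline this score is compared against, the sketch below wires scikit-image HOG descriptors into a scikit-learn SVM hand/not-hand classifier. It is a minimal, hypothetical pipeline (patch size, kernel and parameters are assumptions), not the paper's non-rigid alignment score.

```python
# Minimal sketch of a rigid HOG + SVM hand/not-hand classifier.
# Assumes 64x64 grayscale patches; all parameter choices are illustrative.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_descriptor(patch):
    """HOG feature vector for a 64x64 grayscale patch."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

def train_hand_classifier(hand_patches, nonhand_patches):
    """Fit an SVM on HOG descriptors of hand and non-hand patches (both lists)."""
    X = np.array([hog_descriptor(p) for p in hand_patches + nonhand_patches])
    y = np.array([1] * len(hand_patches) + [0] * len(nonhand_patches))
    return SVC(kernel="rbf", probability=True).fit(X, y)

def score_window(clf, patch):
    """Return P(hand) for one candidate detection window."""
    return clf.predict_proba(hog_descriptor(patch).reshape(1, -1))[0, 1]
```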
Abstract:
The performance of different classification approaches is evaluated using a view-based approach for motion representation. The view-based approach uses computer vision and image processing techniques to register and process the video sequence. Two motion representations, called Motion Energy Images and Motion History Images, are then constructed. These representations collapse the temporal component in such a way that no explicit temporal analysis or sequence matching is needed. Statistical descriptions are then computed using moment-based features and dimensionality reduction techniques. For these tests, we used 7 Hu moments, which are invariant to scale and translation. Principal Components Analysis is used to reduce the dimensionality of this representation. The system is trained using different subjects performing a set of examples of every action to be recognized. Given these samples, K-nearest neighbor, Gaussian, and Gaussian mixture classifiers are used to recognize new actions. Experiments are conducted using instances of eight human actions (i.e., eight classes) performed by seven different subjects. Comparisons of the performance among these classifiers under different conditions are analyzed and reported. Our main goals are to test this dimensionality-reduced representation of actions and, more importantly, to use this representation to compare the advantages of different classification approaches in this recognition task.
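For readers who want a concrete picture of this pipeline, the sketch below builds a motion history image from frame differences, extracts the 7 Hu moments, reduces dimensionality with PCA and classifies with k-nearest neighbors. It is an illustrative reconstruction, not the authors' code; thresholds, the decay constant and component counts are assumptions.

```python
# Illustrative sketch: MHI -> Hu moments -> PCA -> k-NN action classifier.
# Frames are single-channel uint8 images; all parameter values are guesses.
import numpy as np
import cv2
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def motion_history_image(frames, tau=30, diff_thresh=30):
    """Collapse a frame sequence into one motion history image (recent motion is brightest)."""
    mhi = np.zeros(frames[0].shape, dtype=np.float32)
    for prev, cur in zip(frames[:-1], frames[1:]):
        moving = cv2.absdiff(cur, prev) > diff_thresh
        mhi = np.where(moving, float(tau), np.maximum(mhi - 1.0, 0.0)).astype(np.float32)
    return mhi

def hu_features(mhi):
    """Seven Hu moments of the MHI, log-scaled to tame their dynamic range."""
    hu = cv2.HuMoments(cv2.moments(mhi)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

# Training and testing (sequences and labels assumed to exist):
# X_train = np.array([hu_features(motion_history_image(seq)) for seq in train_seqs])
# pca = PCA(n_components=5).fit(X_train)
# knn = KNeighborsClassifier(n_neighbors=3).fit(pca.transform(X_train), y_train)
# pred = knn.predict(pca.transform([hu_features(motion_history_image(test_seq))]))
```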
Abstract:
An improved technique for 3D head tracking under varying illumination conditions is proposed. The head is modeled as a texture-mapped cylinder. Tracking is formulated as an image registration problem in the cylinder's texture map image. The resulting dynamic texture map provides a stabilized view of the face that can be used as input to many existing 2D techniques for face recognition, facial expression analysis, lip reading, and eye tracking. To solve the registration problem in the presence of lighting variation and head motion, the residual registration error is modeled as a linear combination of texture warping templates and orthogonal illumination templates. Fast and stable on-line tracking is achieved via regularized, weighted least squares minimization of the registration error. The regularization term tends to limit potential ambiguities that arise in the warping and illumination templates, and it enables stable tracking over extended sequences. Tracking does not require a precise initial fit of the model; the system is initialized automatically using a simple 2D face detector. The only assumption is that the target is facing the camera in the first frame of the sequence. The formulation is tailored to take advantage of texture mapping hardware available in many workstations, PCs, and game consoles. The non-optimized implementation runs at about 15 frames per second on an SGI O2 graphics workstation. Extensive experiments evaluating the effectiveness of the formulation are reported. The sensitivity of the technique to illumination, regularization parameters, errors in the initial positioning, and internal camera parameters is analyzed. Examples and applications of tracking are reported.
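The per-frame registration step amounts to a regularized, weighted least-squares solve; the fragment below is a generic numpy sketch of that solve, with B, W, r and lam as illustrative stand-ins for the template matrix, pixel weights, residual and regularization weight (not the paper's exact notation).

```python
# Generic regularized weighted least squares: the residual r is modeled as B @ q,
# where the columns of B are warping and illumination templates and W weights pixels.
import numpy as np

def solve_templates(B, W, r, lam):
    """Return q minimizing ||W^(1/2) (r - B q)||^2 + lam * ||q||^2."""
    k = B.shape[1]
    lhs = B.T @ W @ B + lam * np.eye(k)   # normal equations with Tikhonov regularization
    rhs = B.T @ W @ r
    return np.linalg.solve(lhs, rhs)
```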
Abstract:
A method called "SymbolDesign" is proposed that can be used to design user-centered interfaces for pen-based input devices. It can also extend the functionality of pointer input devices such as the traditional computer mouse or the Camera Mouse, a camera-based computer interface. Users can create their own interfaces by choosing single-stroke movement patterns that are convenient to draw with the selected input device and by mapping them to a desired set of commands. A pattern could be the trace of a moving finger detected with the Camera Mouse or a symbol drawn with an optical pen. The core of the SymbolDesign system is a dynamically created classifier, in the current implementation an artificial neural network. The architecture of the neural network automatically adjusts according to the complexity of the classification task. In experiments, subjects used the SymbolDesign method to design and test the interfaces they created, for example, to browse the web. The experiments demonstrated good recognition accuracy and responsiveness of the user interfaces. The method provided an easily-designed and easily-used computer input mechanism for people without physical limitations, and, with some modifications, has the potential to become a computer access tool for people with severe paralysis.
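A minimal stand-in for this kind of trainable single-stroke classifier is sketched below: each stroke is resampled to a fixed number of points and fed to a small scikit-learn MLP. The resampling, normalization and network size are assumptions for illustration, not the SymbolDesign architecture.

```python
# Sketch of a single-stroke symbol classifier: resample each stroke to a fixed
# length, normalize it, flatten the (x, y) points and train a small neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier

def resample_stroke(points, n=32):
    """Resample a stroke (sequence of (x, y) pairs) to n evenly spaced points."""
    pts = np.asarray(points, dtype=float)
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    t = np.linspace(0.0, d[-1], n)
    xy = np.c_[np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])]
    xy -= xy.mean(axis=0)              # translation invariance
    xy /= np.abs(xy).max() + 1e-9      # crude scale normalization
    return xy.ravel()

# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
# clf.fit(np.array([resample_stroke(s) for s in strokes]), commands)
# predicted_command = clf.predict([resample_stroke(new_stroke)])[0]
```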
Abstract:
Choosing the right or best option is often a demanding and challenging task for the user (e.g., a customer at an online retailer) when there are many available alternatives. In fact, the user rarely knows which offering will provide the highest value. To reduce the complexity of the choice process, automated recommender systems generate personalized recommendations. These recommendations take into account the preferences collected from the user in an explicit (e.g., letting users express their opinion about items) or implicit (e.g., studying some behavioral features) way. Such systems are widespread; research indicates that they increase customer satisfaction and lead to higher sales. Preference handling is one of the core issues in the design of every recommender system. This kind of system often aims at guiding users in a personalized way to interesting or useful options in a large space of possible options. Therefore, it is important for them to capture and model the user's preferences as accurately as possible. In this thesis, we develop a comparative preference-based user model to represent the user's preferences in conversational recommender systems. This type of user model allows the recommender system to capture several preference nuances from the user's feedback. We show that, when applied to conversational recommender systems, the comparative preference-based model is able to guide the user towards the best option while the system is interacting with her. We empirically test and validate the suitability and the practical computational aspects of the comparative preference-based user model and its preference relations by comparing them to a sum-of-weights-based user model and its preference relations. Product configuration, scheduling a meeting and the construction of autonomous agents are among several artificial intelligence tasks that involve constrained optimization, that is, optimization of behavior or options subject to given constraints with regard to a set of preferences. When solving a constrained optimization problem, pruning techniques, such as branch and bound, aim to direct the search towards the best assignments, allowing the bounding functions to prune more branches of the search tree. Several constrained optimization problems may exhibit dominance relations. These dominance relations can be particularly useful in constrained optimization problems, as they can motivate new pruning rules for discarding non-optimal solutions. Such pruning methods can achieve dramatic reductions in the search space while looking for optimal solutions. A number of constrained optimization problems can model the user's preferences using comparative preferences. In this thesis, we develop a set of pruning rules, used within the branch and bound technique, to efficiently solve this kind of optimization problem. More specifically, we show how to generate newly defined pruning rules from a dominance algorithm that refers to a set of comparative preferences. These rules include pruning approaches (and combinations of them) which can drastically prune the search space. They mainly reduce the number of (expensive) pairwise comparisons performed during the search while guiding constrained optimization algorithms to find optimal solutions. Our experimental results show that the pruning rules we have developed, and their different combinations, have varying impact on the performance of the branch and bound technique.
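The branch-and-bound pruning idea can be illustrated with a small, generic sketch; the bound and dominance checks below are placeholders standing in for the comparative-preference pruning rules developed in the thesis, and all names are illustrative.

```python
# Generic branch-and-bound skeleton with two pruning hooks: a bound check and a
# dominance check. Both hooks are problem-specific; here they are placeholders.
def branch_and_bound(root, children, is_complete, value, bound, dominated):
    """Maximize value over complete assignments reachable from root.

    children(node)             -> iterable of child partial assignments
    is_complete(node)          -> True if node is a full assignment
    value(node)                -> objective value of a complete assignment
    bound(node)                -> optimistic upper bound on values reachable from node
    dominated(node, incumbent) -> True if a dominance rule allows pruning the subtree
    """
    best, best_val = None, float("-inf")
    stack = [root]
    while stack:
        node = stack.pop()
        if bound(node) <= best_val:
            continue                  # bounding: this subtree cannot beat the incumbent
        if best is not None and dominated(node, best):
            continue                  # dominance pruning: discard without full evaluation
        if is_complete(node):
            v = value(node)
            if v > best_val:
                best, best_val = node, v
        else:
            stack.extend(children(node))
    return best
```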
Abstract:
The aim of this research, which focused on the Irish adult population, was to generate information for policymakers by applying statistical analyses and current technologies to oral health administrative and survey databases. Objectives included identifying socio-demographic influences on oral health and utilisation of dental services, comparing epidemiologically-estimated dental treatment need with treatment provided, and investigating the potential of a dental administrative database to provide information on utilisation of services and the volume and types of treatment provided over time. Information was extracted from the claims databases for the Dental Treatment Benefit Scheme (DTBS) for employed adults and the Dental Treatment Services Scheme (DTSS) for less-well-off adults, the National Surveys of Adult Oral Health, and the 2007 Survey of Lifestyle Attitudes and Nutrition in Ireland. Factors associated with utilisation and retention of natural teeth were analysed using count data models and logistic regression. The chi-square test and Student's t-test were used to compare epidemiologically-estimated need in a representative sample of adults with treatment provided. Differences were found in dental care utilisation and tooth retention by socio-economic status. An analysis of the five-year utilisation behaviour of a 2003 cohort of DTBS dental attendees revealed that age and being female were positively associated with visiting annually and with the number of treatments received. The number of adults using the DTBS increased, and the mean number of treatments per patient decreased, between 1997 and 2008. As a percentage of overall treatments, restorations, dentures, and extractions decreased, while prophylaxis increased. Differences were found between epidemiologically-estimated treatment need and treatment provided for those using the DTBS and DTSS. This research confirms the utility of survey and administrative data for generating knowledge for policymakers. Public administrative databases have not been designed for research purposes, but they have the potential to provide a wealth of knowledge on treatments provided and utilisation patterns.
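Purely for illustration, a count-data analysis of the kind mentioned here could be set up along the following lines, with a Poisson regression of visit counts on demographic covariates in statsmodels; the variable names and data are invented, not taken from the DTBS/DTSS databases.

```python
# Hypothetical sketch of a count-data model: number of dental visits regressed on
# age, sex and a socio-economic indicator. Data and variable names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "visits": rng.poisson(2, size=500),            # synthetic visit counts
    "age": rng.integers(18, 80, size=500),
    "female": rng.integers(0, 2, size=500),
    "medical_card": rng.integers(0, 2, size=500),  # stand-in for an SES indicator
})

poisson_fit = smf.poisson("visits ~ age + female + medical_card", data=df).fit()
print(poisson_fit.summary())

# A logistic regression for tooth retention could be specified the same way, e.g.
# smf.logit("retains_teeth ~ age + female + medical_card", data=df).fit()
```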
Abstract:
The principal objective of this thesis was to investigate the ability of reversible optical O2 sensors to be incorporated into food/beverage packaging systems to continuously monitor O2 levels in a non-destructive manner, immediately post-packaging and over time. Residual levels of O2 present in packs can negatively affect product quality and, subsequently, product shelf-life, especially for O2-sensitive foods/beverages. Therefore, the ability of O2 sensors to continuously monitor O2 levels present within food/beverage packages was considered commercially relevant in terms of identifying the consequences of residual O2 on product safety and quality over time. Research commenced with the development of a novel range of O2 sensors based on phosphorescent platinum and palladium octaethylporphyrin-ketones (OEPk) in nano-porous high-density polyethylene (HDPE), polypropylene (PP) and polytetrafluoroethylene (PTFE) polymer supports. Sensors were calibrated over a temperature range of -10°C to +40°C and deemed suitable for food and beverage packaging applications. This sensor technology proved effective in detecting failures in packaging containment, as was clearly demonstrated in the packaging of cheese string products. The sensor technology was also assessed across a wide range of packaged products: beer, ready-to-eat salad products, bread, and convenience-style, muscle-based processed food products. The O2 sensor technology performed extremely well within all packaging systems. It adequately detected O2 levels in beer bottles prior to and following pasteurisation; in modified atmosphere (MA) packs of ready-to-eat salads as respiration progressed during product storage; and in MA packs of bread and convenience-style muscle-based products as mycological growth occurred over time, in the presence and absence of ethanol emitters. Used in conjunction with standard food quality assessment techniques, the technology proved remarkably useful in determining the impact of actual O2 levels on specific quality attributes. The O2 sensing probe was modified, miniaturised and automated to screen for total aerobic viable counts (TVC) in samples from several fish species. The test showed good correlation with the conventional TVC test (ISO 4833:2003), good analytical performance, and ruggedness with respect to variation of key assay parameters (probe concentration and pipetting volume). Overall, the respirometric fish TVC test was simple to use, possessed a dynamic microbial range of 10⁴–10⁷ cfu/g sample, had an accuracy of ±1 log(cfu/g sample) and was rapid. Its ability to assess highly perishable products such as fish for total microbial growth in under 12 h demonstrates commercial potential.
Abstract:
The concept of pellicular particles was suggested by Horváth and Lipsky over fifty years ago. The reasoning behind these particles was to improve column efficiency by shortening the pathways analyte molecules can travel, thereby reducing the effect of the A and C terms. Several types of shell particles were successfully marketed around this time; however, with the introduction of high-quality fully porous silica under 10 μm, shell particles faded into the background. In recent years a new generation of core shell particles has become popular within the separation science community. These particles allow fast and efficient separations that can be carried out on conventional HPLC systems. Chapter 1 of this thesis introduces the chemistry of chromatographic stationary phases, with an emphasis on silica bonded phases, particularly focusing on the current state of technology in this area. The main focus is on superficially porous silica particles as a support material for liquid chromatography. A summary of the history and development of these particles over the past few decades is given, along with current methods of synthesis of shell particles. While commercial shell particles have a rough outer surface, Chapter 2 focuses on a novel approach to the growth of smooth surface superficially porous particles in a step-by-step manner: from the Stöber methodology to the seeded growth technique, and finally to the layer-by-layer growth of the porous shell. The superficially porous particles generated in this work have an overall diameter of 2.6 μm with a 350 nm porous shell; these silica particles were characterised using SEM, TEM and BET analysis. The uniform spherical nature of the particles, along with their surface area, pore size and particle size distribution, is examined in this chapter. I discovered that these smooth surface shell particles can be synthesised to give surface area and pore size comparable to commercial brands. Chapter 3 deals with the bonding of the particles prepared in Chapter 2 with C18 functionality; one batch with a narrow and one with a wide particle size distribution. This chapter examines the chromatographic and kinetic performance of these silica stationary phases, and compares them to a commercial superficially porous silica phase with a rough outer surface. I found that the particle size distribution does not seem to be the major contributor to the improvement in efficiency. The surface morphology of the particles appears to play an important role in the packing process and influences the Van Deemter terms. Chapter 4 focuses on the functionalisation of 2.6 μm smooth surface superficially porous particles with a variety of fluorinated and phenyl silanes. The same processes were carried out on 3.0 μm fully porous silica particles to provide a comparison. All phases were assessed using elemental analysis, thermogravimetric analysis and nitrogen sorption analysis, and chromatographically evaluated using the Neue test. I observed comparable results for the 2.6 μm shell pentafluorophenyl propyl silica when compared to 3.0 μm fully porous silica. Chapter 5 moves towards nano-particles, with the synthesis of sub-1 μm superficially porous particles, their characterisation and their use in chromatography. The particles prepared are 750 nm in total diameter with a 100 nm shell. All reactions and tests carried out on these 750 nm core shell particles are also carried out on 1.5 μm fully porous particles in order to give a comparative result. The 750 nm core shell particles can be synthesised quickly and are very uniform. The main drawback in their use for HPLC is the system itself, owing to the backpressure experienced with sub-1 μm particles. The synthesis of modified Stöber particles is also examined in this chapter, with a range of non-porous silica and shell silica from 70 nm to 750 nm being tested for use on a Langmuir-Blodgett system. These smooth surface shell particles have only been in existence since 2009. The results presented in this thesis demonstrate how much potential smooth surface shell particles have, provided that more in-depth optimisation is carried out. The packing studies reported in this thesis aim to serve as a starting point for a more sophisticated methodology, which in turn can lead to greater chromatographic improvements.
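For background (standard chromatography theory rather than a result of the thesis): the A and C terms referred to above are those of the Van Deemter equation, which expresses plate height H as a function of the mobile-phase linear velocity u; shortening analyte path lengths reduces the A (eddy diffusion) and C (mass-transfer) contributions.

```latex
% Van Deemter equation (textbook form):
% H = plate height, u = mobile-phase linear velocity,
% A = eddy diffusion, B = longitudinal diffusion, C = resistance to mass transfer.
\[
  H = A + \frac{B}{u} + C\,u
\]
```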
Abstract:
Background: Many European countries, including Ireland, lack high-quality, ongoing, population-based estimates of maternal behaviours and experiences during pregnancy. PRAMS is a CDC surveillance program which was established in the United States in 1987 to generate high-quality, population-based data to reduce infant mortality rates and improve maternal and infant health. PRAMS is the only ongoing population-based surveillance system of maternal behaviours and experiences that occur before, during and after pregnancy worldwide. Methods: The objective of this study was to adapt, test and evaluate a modified CDC PRAMS methodology in Ireland. The birth certificate file, which is the standard approach to sampling for PRAMS in the United States, was not available for the PRAMS Ireland study. Consequently, delivery record books for the period between 3 and 5 months before the study start date at a large urban obstetric hospital [8,900 births per year] were used to randomly sample 124 women. Name, address, maternal age, infant sex, gestational age at delivery, delivery method, APGAR score and birth weight were manually extracted from records. Stillbirths and early neonatal deaths were excluded using APGAR scores and hospital records. Women were sent a letter of invitation to participate, including an option to opt out, followed by a modified PRAMS survey, a reminder letter and a final survey. Results: The response rate for the pilot was 67%. Two per cent of women refused the survey, 7% opted out of the study and 24% did not respond. Survey items were at least 88% complete for all 82 respondents. Prevalence estimates of socially undesirable behaviours such as alcohol consumption during pregnancy were high [>50%] and comparable with international estimates. Conclusion: PRAMS is a feasible and valid method of collecting information on maternal experiences and behaviours during pregnancy in Ireland. With further work, PRAMS may offer a potential solution to data deficits in maternal health behaviour indicators in Ireland. This study is important to researchers in Europe and elsewhere who may be interested in new ways of tailoring an established CDC methodology to their unique settings to resolve data deficits in maternal health.
Abstract:
In order to widely use Ge and III-V materials instead of Si in advanced CMOS technology, the processing and integration of these materials has to be well established so that their high-mobility benefit is not swamped by imperfect manufacturing procedures. In this dissertation a number of key bottlenecks in the realization of Ge devices are investigated. We address the challenge of forming low-resistivity contacts on n-type Ge, comparing conventional rapid thermal annealing (RTA) with the more advanced laser thermal annealing (LTA). LTA appears to be a feasible approach for the realization of low-resistivity contacts, with an exceptionally sharp germanide-substrate interface and contact resistivity on the order of 10⁻⁷ Ω·cm². Furthermore, the influence of RTA and LTA on dopant activation and leakage current suppression in n+/p Ge junctions was compared. While providing a very high active carrier concentration (> 10²⁰ cm⁻³), LTA resulted in higher leakage current than RTA, which provided a lower carrier concentration (~10¹⁹ cm⁻³). This indicates a trade-off between high activation level and junction leakage current. A high I_ON/I_OFF ratio of ~10⁷ was obtained, which to the best of our knowledge is the best value reported for n-type Ge so far. Simulations were carried out to investigate how target sputtering, dose retention, and damage formation are generated in thin-body semiconductors by energetic ion impacts, and how they depend on the physical properties of the target material. Solid phase epitaxy studies in wide and thin Ge fins confirmed the formation of twin boundary defects and random nucleation growth, as in Si, but here a 600 °C annealing temperature was found to be effective in reducing these defects. Finally, a non-destructive doping technique was successfully implemented to dope Ge nanowires, where nanowire resistivity was reduced by 5 orders of magnitude using a PH3-based in-diffusion process.
Abstract:
The generation of recombinant antibodies (Abs) using phage display is a proven method to obtain a large variety of Abs that bind with high affinity to a given antigen. Traditionally, the generation of single-chain Abs depends on the use of recombinant proteins in several stages of the procedure. This can be a problem, especially in the case of cell-surface receptors, because Abs generated and selected against recombinant proteins may not bind the same protein expressed on a cell surface in its native form and because the expression of some receptors as recombinant proteins is problematic. To overcome these difficulties, we developed a strategy to generate single-chain Abs that does not require the use of recombinant protein at any stage of the procedure. In this strategy, stably transfected cells are used for the immunization of mice, measuring Ab responses to immunization, panning the phage library, high-throughput screening of arrayed phage clones, and characterization of recombinant single-chain variable regions. This strategy was used to generate a panel of single-chain Abs specific for the innate immunity receptor Toll-like receptor 2. Once generated, individual single-chain variable regions were subcloned into an expression vector allowing the production of recombinant Abs in insect cells, thus avoiding the contamination of recombinant Abs with microbial products. This cell-based system efficiently generates Abs that bind to native molecules on the cell surface, bypasses the requirement of recombinant protein production, and avoids risks of microbial component contamination.
Abstract:
There is a general presumption in the literature and among policymakers that immigrant remittances play the same role in economic development as foreign direct investment and other capital flows, but this is an open question. We develop a model of remittances based on the economics of the family that implies that remittances are not profit-driven, but are compensatory transfers, and should have a negative correlation with GDP growth. This is in contrast to the positive correlation of profit-driven capital flows with GDP growth. We test this implication of our model using a new panel data set on remittances and find a robust negative correlation between remittances and GDP growth. This indicates that remittances may not be intended to serve as a source of capital for economic development. © 2005 International Monetary Fund.
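A stripped-down version of the kind of panel regression implied here might look like the following two-way fixed-effects specification; the data, variable names and coefficient are entirely synthetic and are shown only to make the setup concrete.

```python
# Hypothetical two-way fixed-effects regression of GDP growth on remittances
# (as a share of GDP), with country and year dummies and clustered errors.
# The data are synthetic, with a built-in negative relation purely for demonstration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
countries = [f"c{i}" for i in range(20)]
years = list(range(1990, 2005))
df = pd.DataFrame([(c, t) for c in countries for t in years], columns=["country", "year"])
df["remit_gdp"] = rng.uniform(0.0, 0.1, len(df))
df["gdp_growth"] = -0.2 * df["remit_gdp"] + rng.normal(0.0, 0.02, len(df))

fe = smf.ols("gdp_growth ~ remit_gdp + C(country) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": pd.factorize(df["country"])[0]})
print(fe.params["remit_gdp"])  # a negative estimate would mirror the paper's finding
```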
Abstract:
Does environmental regulation impair international competitiveness of pollution-intensive industries to the extent that they relocate to countries with less stringent regulation, turning those countries into "pollution havens"? We test this hypothesis using panel data on outward foreign direct investment (FDI) flows of various industries in the German manufacturing sector and account for several econometric issues that have been ignored in previous studies. Most importantly, we demonstrate that externalities associated with FDI agglomeration can bias estimates away from finding a pollution haven effect if omitted from the analysis. We include the stock of inward FDI as a proxy for agglomeration and employ a GMM estimator to control for endogenous time-varying determinants of FDI flows. Furthermore, we propose a difference estimator based on the least polluting industry to break the possible correlation between environmental regulatory stringency and unobservable attributes of FDI recipients in the cross-section. When accounting for these issues we find robust evidence of a pollution haven effect for the chemical industry. © 2008 Springer Science+Business Media B.V.
Abstract:
In recent years, the storage and use of residual newborn screening (NBS) samples has gained attention. To inform ongoing policy discussions, this article provides an update of previous work on new policies, educational materials, and parental options regarding the storage and use of residual NBS samples. A review of state NBS Web sites was conducted for information related to the storage and use of residual NBS samples in January 2010. In addition, a review of current statutes and bills introduced between 2005 and 2009 regarding storage and/or use of residual NBS samples was conducted. Fourteen states currently provide information about the storage and/or use of residual NBS samples. Nine states provide parents the option to request destruction of the residual NBS sample after the required storage period or the option to exclude the sample for research uses. In the coming years, it is anticipated that more states will consider policies to address parental concerns about the storage and use of residual NBS samples. Development of new policies regarding storage and use of residual NBS samples will require careful consideration of impact on NBS programs, parent and provider educational materials, and respect for parents among other issues.
Abstract:
Described here is a mass spectrometry-based screening assay for the detection of protein-ligand binding interactions in multicomponent protein mixtures. The assay utilizes an oxidation labeling protocol that involves using hydrogen peroxide to selectively oxidize methionine residues in proteins in order to probe the solvent accessibility of these residues as a function of temperature. The extent to which methionine residues in a protein are oxidized after specified reaction times at a range of temperatures is determined in a MALDI analysis of the intact proteins and/or an LC-MS analysis of tryptic peptide fragments generated after the oxidation reaction is quenched. Ultimately, the mass spectral data is used to construct thermal denaturation curves for the detected proteins. In this proof-of-principle work, the protocol is applied to a four-protein model mixture comprised of ubiquitin, ribonuclease A (RNaseA), cyclophilin A (CypA), and bovine carbonic anhydrase II (BCAII). The new protocol's ability to detect protein-ligand binding interactions by comparing thermal denaturation data obtained in the absence and in the presence of ligand is demonstrated using cyclosporin A (CsA) as a test ligand. The known binding interaction between CsA and CypA was detected using both the MALDI- and LC-MS-based readouts described here.
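The thermal denaturation curves described here are commonly summarized by fitting a sigmoid to the fraction-oxidized signal versus temperature and comparing midpoints with and without ligand; the sketch below does this with scipy. The model, function names and data are illustrative, not the authors' analysis code.

```python
# Illustrative two-state sigmoid fit to denaturation data: a ligand-stabilized
# protein shows an upward shift in the fitted midpoint Tm. Data are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def two_state(T, Tm, slope):
    """Fraction denatured (here: oxidizable) as a function of temperature T."""
    return 1.0 / (1.0 + np.exp(-(T - Tm) / slope))

def fit_tm(temps, fraction_oxidized):
    """Return the fitted midpoint temperature Tm."""
    popt, _ = curve_fit(two_state, temps, fraction_oxidized,
                        p0=[np.median(temps), 2.0])
    return popt[0]

# temps = np.array([...])                    # assay temperatures (degrees C)
# tm_apo = fit_tm(temps, frac_ox_no_ligand)  # e.g. CypA alone
# tm_holo = fit_tm(temps, frac_ox_with_csa)  # e.g. CypA + CsA
# A clearly positive tm_holo - tm_apo suggests a binding interaction.
```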