964 results for contract-based guidance
Abstract:
Performance-based maintenance contracts differ significantly from the material- and method-based contracts traditionally used to maintain roads. Road agencies around the world have moved towards a performance-based contract approach because it offers several advantages, such as cost savings, greater budgeting certainty, and better customer satisfaction through improved road services and conditions. In these contracts, payments for road maintenance are explicitly linked to the contractor successfully meeting clearly defined minimum performance indicators. Quantitative evaluation of the cost of performance-based contracts presents several difficulties due to the complexity of the pavement deterioration process. Based on a probabilistic analysis of failures to achieve multiple performance criteria over the length of the contract period, an effort has been made to develop a model capable of estimating the cost of these performance-based contracts. One of the essential functions of such a model is to predict the performance of the pavement as accurately as possible. Future pavement deterioration is predicted using a Markov chain process, which requires estimating transition probabilities from previous deterioration rates for similar pavements. Transition probabilities were derived from historical pavement condition rating data, both for predicting pavement deterioration in the absence of maintenance and for predicting pavement improvement when maintenance activities are performed. A methodological framework has been developed to estimate the cost of maintaining roads based on multiple performance criteria such as cracking, rutting, and roughness. The application of the developed model is demonstrated via a real case study of Miami Dade Expressways (MDX), using pavement condition rating data from the Florida Department of Transportation (FDOT) for a typical performance-based asphalt pavement maintenance contract.
Results indicate that the developed pavement performance model can predict pavement deterioration quite accurately. Sensitivity analysis shows that the model is highly responsive to even slight changes in pavement deterioration rate and performance constraints. The use of this model is expected to assist highway agencies and contractors in arriving at a fair contract value for executing long-term performance-based pavement maintenance works.
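The Markov-chain prediction step described in this abstract can be sketched as follows. The four condition states and the transition probabilities below are illustrative assumptions for the example, not values estimated from the FDOT data.

```python
import numpy as np

# Hypothetical 4-state condition rating: 0 = excellent ... 3 = failed.
# In the described model, P would be estimated from historical pavement
# condition rating data; these values are illustrative only.
P = np.array([
    [0.80, 0.15, 0.05, 0.00],
    [0.00, 0.75, 0.20, 0.05],
    [0.00, 0.00, 0.70, 0.30],
    [0.00, 0.00, 0.00, 1.00],  # failed state is absorbing without maintenance
])

def predict_condition(initial, years):
    """Propagate a condition-state probability distribution forward in time."""
    state = np.asarray(initial, dtype=float)
    for _ in range(years):
        state = state @ P
    return state

# Probability of breaching a performance threshold (state 2 or worse)
# five years after starting from excellent condition.
dist = predict_condition([1.0, 0.0, 0.0, 0.0], 5)
p_breach = dist[2] + dist[3]
```

A contract-cost model of the kind described would evaluate such breach probabilities for each performance criterion (cracking, rutting, roughness) over the contract period and price the expected maintenance accordingly.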
Abstract:
Purpose: This study involved an extensive search for randomized controlled clinical trials comparing bilateral balanced and canine-guided dentures, and questioned whether a bilateral balanced occlusion is imperative for successful denture treatment. Materials and Methods: Studies were identified by searching electronic databases (PubMed/MEDLINE, ISI Web of Science, LILACS, and BBD). The keywords “denture” and “occlusion” were used. The minimum inclusion requirements were (1) randomized controlled trials with patients of any age wearing both maxillary and mandibular conventional complete dentures (CDs), (2) comparison between bilateral balanced and canine-guided dentures, and (3) assessment of masticatory function and/or patients’ satisfaction. Results: The search resulted in the identification of 5166 articles. Subsequently, 5156 articles were excluded on the basis of title and abstract. By the end of the search phase, seven randomized controlled trials were considered eligible. Conclusions: Current scientific evidence suggests that bilateral balanced occlusion is not imperative for successful treatment with conventional CDs in average patients. More studies are necessary to identify if specific clinical conditions may benefit from a balanced occlusion.
Abstract:
BACKGROUND: Guidance for appropriate utilisation of transthoracic echocardiograms (TTEs) can be incorporated into ordering prompts, potentially affecting the number of requests. METHODS: We incorporated data from the 2011 Appropriate Use Criteria for Echocardiography, the 2010 National Institute for Clinical Excellence Guideline on Chronic Heart Failure, and American College of Cardiology Choosing Wisely list on TTE use for dyspnoea, oedema and valvular disease into electronic ordering systems at Durham Veterans Affairs Medical Center. Our primary outcome was TTE orders per month. Secondary outcomes included rates of outpatient TTE ordering per 100 visits and frequency of brain natriuretic peptide (BNP) ordering prior to TTE. Outcomes were measured for 20 months before and 12 months after the intervention. RESULTS: The number of TTEs ordered did not decrease (338±32 TTEs/month prior vs 320±33 afterwards, p=0.12). Rates of outpatient TTE ordering decreased minimally post intervention (2.28 per 100 primary care/cardiology visits prior vs 1.99 afterwards, p<0.01). Effects on TTE ordering and ordering rate significantly interacted with time from intervention (p<0.02 for both), as the small initial effects waned after 6 months. The percentage of TTE orders with preceding BNP increased (36.5% prior vs 42.2% after for inpatients, p=0.01; 10.8% prior vs 14.5% after for outpatients, p<0.01). CONCLUSIONS: Ordering prompts for TTEs initially minimally reduced the number of TTEs ordered and increased BNP measurement at a single institution, but the effect on TTEs ordered was likely insignificant from a utilisation standpoint and decayed over time.
Abstract:
This book brings together experts in the fields of spatial planning, land-use and infrastructure management to explore the emerging agenda of spatially-oriented integrated evaluation. It weaves together the latest theories, case studies, methods, policy and practice to examine and assess the values, impacts, benefits and the overall success of integrated land-use management. In doing so, the book clarifies the nature and roles of evaluation and puts forward guidance for future policy and practice.
Abstract:
Objective
Pedestrian detection under video surveillance systems has always been a hot topic in computer vision research. These systems are widely used in train stations, airports, large commercial plazas, and other public places. However, pedestrian detection remains difficult because of complex backgrounds. Given its development in recent years, the visual attention mechanism has attracted increasing attention in object detection and tracking research, and previous studies have achieved substantial progress and breakthroughs. We propose a novel pedestrian detection method based on the semantic features under the visual attention mechanism.
Method
The proposed semantic feature-based visual attention model is a spatial-temporal model that consists of two parts: the static visual attention model and the motion visual attention model. The static visual attention model in the spatial domain is constructed by combining bottom-up with top-down attention guidance. Based on the characteristics of pedestrians, the bottom-up visual attention model of Itti is improved by intensifying the orientation vectors of elementary visual features to make the visual saliency map suitable for pedestrian detection. In terms of pedestrian attributes, skin color is selected as a semantic feature for pedestrian detection. The regional and Gaussian models are adopted to construct the skin color model. Skin feature-based visual attention guidance is then proposed to complete the top-down process. The bottom-up and top-down visual attentions are linearly combined using the proper weights obtained from experiments to construct the static visual attention model in the spatial domain. The spatial-temporal visual attention model is then constructed via the motion features in the temporal domain. Based on the static visual attention model in the spatial domain, the frame difference method is combined with optical flow to detect motion vectors. Filtering is applied to process the field of motion vectors. The saliency of motion vectors can be evaluated via motion entropy to make the selected motion feature more suitable for the spatial-temporal visual attention model.
Result
Standard datasets and practical videos are selected for the experiments. The experiments are performed on a MATLAB R2012a platform. The experimental results show that our spatial-temporal visual attention model demonstrates favorable robustness under various scenes, including indoor train station surveillance videos and outdoor scenes with swaying leaves. Our proposed model outperforms the visual attention model of Itti, the graph-based visual saliency model, the phase spectrum of quaternion Fourier transform model, and the motion channel model of Liu in terms of pedestrian detection. The proposed model achieves a 93% accuracy rate on the test video.
Conclusion
This paper proposes a novel pedestrian detection method based on the visual attention mechanism. A spatial-temporal visual attention model that uses low-level and semantic features is proposed to calculate the saliency map. Based on this model, pedestrian targets can be detected through focus-of-attention shifts. The experimental results verify the effectiveness of the proposed attention model for detecting pedestrians.
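As a rough illustration of the linear-combination step in the static model above, the following sketch fuses a normalized bottom-up saliency map with a top-down (skin-colour) map. The weight values are placeholders, not the experimentally tuned weights from the paper, and the random arrays stand in for real saliency maps.

```python
import numpy as np

def combined_saliency(bottom_up, top_down, w_bu=0.6, w_td=0.4):
    """Linearly combine normalized bottom-up and top-down saliency maps.
    w_bu and w_td are placeholder weights; the paper tunes them
    experimentally."""
    bu = (bottom_up - bottom_up.min()) / (np.ptp(bottom_up) + 1e-9)
    td = (top_down - top_down.min()) / (np.ptp(top_down) + 1e-9)
    return w_bu * bu + w_td * td

rng = np.random.default_rng(0)
bu = rng.random((48, 64))   # stand-in for the improved Itti saliency map
td = rng.random((48, 64))   # stand-in for the skin-colour semantic map
s = combined_saliency(bu, td)
```

With weights summing to one and both inputs normalized to [0, 1], the combined map stays in [0, 1], so a single detection threshold can be applied to it.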
Abstract:
Abstract: Many individuals who have had a stroke have motor impairments, such as timing deficits, that hinder their ability to complete daily activities like getting dressed. Robotic rehabilitation is an increasingly popular therapeutic avenue for improving motor recovery in this population. Yet most studies have focused on improving the spatial aspect of movement (e.g. reaching), and not the temporal one (e.g. timing). Hence, the main aim of this study was to compare two types of robotic rehabilitation on the immediate improvement of timing accuracy: haptic guidance (HG), which consists of guiding the person to make the correct movement, thereby decreasing his or her movement errors, and error amplification (EA), which consists of increasing the person's movement errors. The secondary objective was to explore whether the side of the stroke lesion had an effect on timing accuracy following HG and EA training. Thirty-four persons who had had a stroke (average age 67 ± 7 years) participated in a single training session of a timing-based task (a simulated pinball-like task), in which they had to activate a robot at the correct moment to successfully hit targets presented at random on a computer screen. Participants were randomly divided into two groups, receiving either HG or EA. During the same session, a baseline phase and a retention phase were given before and after each training, and these phases were compared in order to evaluate and compare the immediate impact of HG and EA on movement timing accuracy. The results showed that HG improved immediate timing accuracy (p=0.03), but EA did not (p=0.45). Comparing the two trainings, HG proved superior to EA at improving timing (p=0.04). Furthermore, a significant correlation was found between the side of the stroke lesion and the change in timing accuracy following EA (r_pb=0.7, p=0.001), but not HG (r_pb=0.18, p=0.24).
In other words, timing accuracy deteriorated for participants with a left-hemisphere lesion who trained with EA, whereas participants with a right-sided stroke lesion improved their timing accuracy following EA. In sum, HG appears to help improve immediate timing accuracy for individuals who have had a stroke. Still, the side of the stroke lesion seems to play a part in participants' response to training. This remains to be further explored, along with the impact of providing more training sessions, in order to assess any long-term benefits of HG or EA.
A Comparative Study on the Homeroom Teachers’ Perception of the School Guidance in Korea and Finland
Abstract:
This study has four major purposes. First, it compares the school guidance provided by homeroom teachers in Korea and Finland, in order to understand the reality of education based on the teachers' perceptions. Second, it considers the topic within its historical, social, and cultural backgrounds, from a critical standpoint. Third, it investigates directions for improving school guidance, based on an analysis of the similarities and differences between Korea and Finland with regard to the meaning, practice, and environmental factors of school guidance. Lastly, the influential factors surrounding school guidance are identified by analysing empirical data from a microscopic approach and extending that understanding into a social context. As for the methods, the study employs a thematic analysis approach based on interviews with 10 homeroom teachers in lower secondary schools. As a result, firstly, the teachers in both countries similarly assumed that the role of the teacher was not only to teach the subject, but also to care about every aspect of the students' development in their school life. In addition, they accepted that school guidance had become more significant. However, school guidance was the top priority for the Korean teachers, while subject teaching remained the main task for the Finnish teachers. Secondly, the homeroom teachers in both countries hoped for a better working environment for performing school guidance, with respect to the education budget for school guidance resources, the tight curriculum, and the teachers' increasing workload. Thirdly, school guidance in Korea seemed to be influenced by social expectations and government demands, whereas the Finnish teachers considered school guidance more in terms of adjustment and academic motivation than of resolving social problems.
Fourthly, the Korean teachers perceived that trust and respect from society and the home had weakened, and they expressed doubts about educational policies and the government's attitude with regard to school guidance. On the other hand, the Finnish teachers believed that they were trusted and respected by society. However, blurred lines of roles and accountability between the homeroom teachers, the home, and society were controversial among the teachers in both countries. To sum up, Finland needs to improve the system and conditions for the school guidance provided by homeroom teachers, and a consensus on the role and tasks of Finnish homeroom teachers in school guidance also seems necessary. Meanwhile, Korea should improve the social system and the social consciousness of teachers, school guidance, and schooling before reforming the education system or its conditions.
Abstract:
Worldwide air traffic tends to increase, and for many airports it is no longer an option to expand terminals and runways, so airports are trying to maximize their operational efficiency. Many airports already operate near their maximum capacity. Peak hours imply operational bottlenecks and cause chained delays across flights, impacting passengers, airlines and airports. There is therefore a need for the optimization of ground movements at airports. The ground movement problem consists of routing the departing planes from the gate to the runway for take-off and the arriving planes from the runway to the gate, and of scheduling their movements. The main goal is to minimize the time spent by the planes during their ground movements while respecting all the rules established by the Advanced Surface Movement, Guidance and Control Systems of the International Civil Aviation Organization. Each aircraft event (arrival or departing authorization) generates a new environment and therefore a new instance of the Ground Movement Problem. The proposed optimization approach is based on an Iterated Local Search and provides a fast heuristic solution for each real-time event-generated instance while granting all safety regulations. Preliminary computational results are reported for real data, comparing the heuristic solutions with the solutions obtained using a mixed-integer programming approach.
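An Iterated Local Search of the kind mentioned above alternates local search with perturbation, keeping the best solution found. The skeleton below is generic and is exercised on a toy sequencing objective that stands in for the real routing and scheduling problem (which must additionally honour the A-SMGCS safety rules); none of it is taken from the paper's implementation.

```python
import random

def iterated_local_search(initial, cost, neighbours, perturb, iters=100, seed=0):
    """Generic ILS skeleton: descend to a local optimum, then repeatedly
    perturb and re-descend, keeping the best solution found."""
    rng = random.Random(seed)

    def local_search(sol):
        improved = True
        while improved:
            improved = False
            for cand in neighbours(sol):
                if cost(cand) < cost(sol):
                    sol, improved = cand, True
                    break
        return sol

    best = local_search(initial)
    for _ in range(iters):
        cand = local_search(perturb(best, rng))
        if cost(cand) < cost(best):
            best = cand
    return best

# Toy instance: order four taxiing events to minimise weighted completion
# time -- a stand-in objective, not the paper's ground-movement model.
times = [4, 1, 3, 2]
cost = lambda perm: sum((i + 1) * times[j] for i, j in enumerate(perm))

def neighbours(p):                      # adjacent swaps
    for i in range(len(p) - 1):
        q = list(p); q[i], q[i + 1] = q[i + 1], q[i]
        yield q

def perturb(p, rng):                    # random exchange
    q = list(p)
    i, j = rng.sample(range(len(q)), 2)
    q[i], q[j] = q[j], q[i]
    return q

best = iterated_local_search(list(range(4)), cost, neighbours, perturb)
```

For this toy objective the adjacent-swap local search alone reaches the global optimum (schedule the longest events first); the perturbation step matters on realistic instances with many local optima.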
Abstract:
When a task must be executed in a remote or dangerous environment, teleoperation systems may be employed to extend the influence of the human operator. In the case of manipulation tasks, haptic feedback of the forces experienced by the remote (slave) system is often highly useful in improving an operator's ability to perform effectively. In many of these cases (especially teleoperation over the internet and ground-to-space teleoperation), substantial communication latency exists in the control loop and has the strong tendency to cause instability of the system. The first viable solution to this problem in the literature was based on a scattering/wave transformation from transmission line theory. This wave transformation requires the designer to select a wave impedance parameter appropriate to the teleoperation system. It is widely recognized that a small value of wave impedance is well suited to free motion and a large value is preferable for contact tasks. Beyond this basic observation, however, very little guidance exists in the literature regarding the selection of an appropriate value. Moreover, prior research on impedance selection generally fails to account for the fact that in any realistic contact task there will simultaneously exist contact considerations (perpendicular to the surface of contact) and quasi-free-motion considerations (parallel to the surface of contact). The primary contribution of the present work is to introduce an approximate linearized optimum for the choice of wave impedance and to apply this quasi-optimal choice to the Cartesian reality of such a contact task, in which it cannot be expected that a given joint will be either perfectly normal to or perfectly parallel to the motion constraint. The proposed scheme selects a wave impedance matrix that is appropriate to the conditions encountered by the manipulator. 
This choice may be implemented as a static wave impedance value or as a time-varying choice updated according to the instantaneous conditions encountered. A Lyapunov-like analysis is presented demonstrating that time variation in wave impedance will not violate the passivity of the system. Experimental trials, both in simulation and on a haptic feedback device, are presented validating the technique. Consideration is also given to the case of an uncertain environment, in which an a priori impedance choice may not be possible.
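For concreteness, the scattering/wave transformation with a matrix-valued impedance can be sketched as below. The diagonal impedance values (stiff normal to the contact, light parallel to it) and the signal values are illustrative assumptions, not the quasi-optimal choice derived in the work.

```python
import numpy as np

def wave_transform(xdot, F, B):
    """Map velocity/force signals into wave variables (u, v) for a symmetric
    positive-definite wave impedance matrix B, following the standard
    scattering transformation u = (2B)^(-1/2) (B xdot + F),
    v = (2B)^(-1/2) (B xdot - F)."""
    # Inverse matrix square root of 2B via eigendecomposition.
    w, V = np.linalg.eigh(2.0 * B)
    S_inv = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    u = S_inv @ (B @ xdot + F)
    v = S_inv @ (B @ xdot - F)
    return u, v

# Directional impedance: large value normal to the contact surface,
# small value parallel to it (illustrative numbers only).
B = np.diag([200.0, 5.0])
xdot = np.array([0.01, 0.3])   # small normal velocity, free tangential motion
F = np.array([15.0, 0.5])      # large normal contact force
u, v = wave_transform(xdot, F, B)
```

A quick sanity check on any such implementation is the power identity u'u - v'v = 2 F'xdot, which is what makes the transmission passive regardless of delay.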
Abstract:
Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past 5 years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single cell functional proteomics, focusing on the development of the single cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.
The thesis begins by discussing a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells, which is the prototype for subsequent proteomic microchips of more sophisticated design for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape the way of thinking about cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.
The SCBC is a powerful tool to resolve the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We will demonstrate this point through applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).
The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn grant robustness to the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can be applied to use fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to the perturbation of a targeted inhibitor. Strongly coupled protein-protein interactions constitute most signaling cascades. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Similar to decomposing the atomic interactions into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices to decompose the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes, one associated with mTOR signaling and a second associated with ERK/Src signaling, were resolved, which in turn allowed us to anticipate resistance and to design effective combination therapies, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models, and all predictions were borne out.
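Mechanically, the mode decomposition described above is an eigendecomposition of the measured covariance matrix. The sketch below recovers two planted "signaling modes" from synthetic single-cell data; the protein axes and noise levels are hypothetical and do not reflect the thesis's assay panel.

```python
import numpy as np

# Synthetic single-cell phosphoprotein levels: rows = cells, columns =
# proteins. Two hidden axes drive two proteins each (placeholder stand-ins
# for, e.g., an mTOR-like and an ERK/Src-like axis).
rng = np.random.default_rng(0)
n_cells = 500
axis1 = rng.normal(size=n_cells)
axis2 = rng.normal(size=n_cells)
data = np.column_stack([
    axis1 + 0.1 * rng.normal(size=n_cells),
    axis1 + 0.1 * rng.normal(size=n_cells),
    axis2 + 0.1 * rng.normal(size=n_cells),
    axis2 + 0.1 * rng.normal(size=n_cells),
])

# Diagonalize the protein-protein covariance matrix: eigenvectors are the
# independent signaling modes, eigenvalues their strength.
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # strongest mode first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The two dominant modes should capture nearly all correlated variation.
explained = eigvals[:2].sum() / eigvals.sum()
```

Each column of `eigvecs` is a linear combination of proteins, so a dominant mode loading heavily on a known pathway's proteins is what would be read as, for example, an "mTOR mode".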
In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of clinical translation are presented, along with our solutions to address them. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.
Abstract:
Interest in Mg foams is increasing due to their potential use as biomaterials. Fabrication methods determine their structure to a great extent and, in some cases, may pollute the foam. In this work, Mg foams are fabricated by a replica method that uses packed spheres of activated carbon, a material widely utilized in medicine, as the skeleton. After Mg infiltration, the carbon particles are eliminated by an oxidizing heat treatment. The latter covers the Mg with MgO, which improves performance. In particular, oxidation retards degradation of the foam, as the polarization curves of the Mg foam with and without oxide indicate. The sphericity and regularity of the carbon particles allow control of the structure of the produced open-cell foams.
Abstract:
Cities are small-scale complex socio-ecological systems that host around 60% of the world's population. Ecosystem Services (ES) provided by urban ecosystems offer multiple benefits necessary to cope with present and future urban challenges. These ES include microclimate regulation and runoff control, as well as opportunities for mental and physical recreation, affecting citizens' health and wellbeing. Creating a balance between urban development, land take containment, climate adaptation, and the availability of Urban Green Areas and their related benefits can improve the quality of inhabitants' lives, the economic performance of the city, and social justice and cohesion. This work starts by analysing the current literature on Ecosystem Services (ES), Green and Blue Infrastructure (GBI) and Nature-based Solutions (NBS) and their integration within current European and international sustainability policies. The thesis then focuses on the role of ES, GBI and NBS in urban sustainability and resilience, setting the basis for the core methodological and conceptual approach of this work. The developed ES-based conceptual approach provides guidance on how to map and assess ES, to better inform policy making and to give ES their proper value within the urban context. The proposed interdisciplinary approach navigates the topic of mapping and assessing ES benefits in terms of regulatory services, with a focus on climate mitigation and adaptation, and cultural services, to enhance wellbeing and justice in urban areas. Lastly, this thesis proposes a trans-disciplinary and participatory approach to build resilience over time around all relevant urban ES. The two case studies presented in this dissertation, the city of Bologna and the city of Barcelona, were used to implement, tailor and test the proposed conceptual framework, raising valuable inputs for planning, policy and science.
Abstract:
Recently, JPL's MarCO mission demonstrated that CubeSats are mature enough to be employed in deep space, albeit with limitations related to the commercial components employed. Currently, other deep space CubeSats are planned either as stand-alone missions or as companions to a traditional large probe. Developing a dedicated navigation suite is therefore crucial to reaching a mission's goals, considering the limitations of the onboard components compared with those of typical deep space missions. In this framework, the LICIACube mission represents an ideal candidate test-bench, as it performs a flyby of the Didymos asteroid system subject to strong position, epoch, and pointing requirements. This mission will also allow us to infer the capabilities of such microsatellites and to highlight their limitations against the benefits of a lighter design and tailoring efforts. In this work, the orbit determination (OD) and guidance methods and tools adopted for classical deep space missions have been tailored to CubeSat applications and validated through extensive analyses. In addition, navigation procedures and interfaces have been designed in view of the operations foreseen in late 2022. A pre-launch covariance analysis has been performed to assess the mission's feasibility for the nominal trajectory and its associated uncertainties, based on conservative assumptions about the main parameters. Extensive sensitivity analyses have been carried out to understand the main mission parameters affecting performance and to demonstrate the robustness of the designed trajectory and operation schedule in fulfilling the mission requirements. The developed system was also stress-tested by tuning the models to assess different reconstruction methods for the maneuvers. The analysis demonstrated the feasibility of LICIACube mission navigation in compliance with the mission requirements and compatible with the limited resources available, both in space and on the ground.
Abstract:
In this thesis, the problem of controlling a quadrotor UAV is considered. This is done by presenting an original control system, designed as a combination of Neural Networks and a Disturbance Observer using a composite learning approach for a second-order system, which is a novel methodology in the literature. After a brief introduction to quadrotors, the concepts needed to understand the controller are presented, such as the main notions of advanced control, the basic structure and design of a Neural Network, and the modeling of a quadrotor and its dynamics. The full simulator, developed in the MATLAB Simulink environment and used throughout the whole thesis, is also shown. For guidance and control purposes, a Sliding Mode Controller, used as a reference, is first introduced, and its theory and implementation in the simulator are illustrated. Finally, the original controller is introduced through its novel formulation and its implementation on the model. The effectiveness and robustness of the two controllers are then proven by extensive simulations under different conditions of external disturbance and faults.
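As a minimal illustration of the reference controller's principle, the sketch below applies a boundary-layer sliding mode law to a unit-mass double integrator standing in for one translational axis of the quadrotor. The gains, the tanh smoothing of the sign function, and the constant disturbance are all assumptions for the example, not the thesis's design.

```python
import numpy as np

def smc_step(x, xdot, x_ref, xdot_ref, xddot_ref, lam=2.0, k=5.0, eps=0.1):
    """Boundary-layer sliding mode control for a unit-mass double integrator.
    s = e_dot + lam*e defines the sliding surface; tanh(s/eps) replaces
    sign(s) to soften chattering. Gains are illustrative assumptions."""
    e, edot = x - x_ref, xdot - xdot_ref
    s = edot + lam * e
    return xddot_ref - lam * edot - k * np.tanh(s / eps)

# Simulate tracking a fixed set-point under a constant unmodelled disturbance
# (the robustness property a sliding mode law is chosen for).
dt, x, xdot = 0.001, 0.0, 0.0
for _ in range(10_000):
    u = smc_step(x, xdot, x_ref=1.0, xdot_ref=0.0, xddot_ref=0.0)
    xddot = u + 0.5            # +0.5 is the unmodelled disturbance
    xdot += xddot * dt
    x += xdot * dt
```

Because the switching gain k exceeds the disturbance bound, the state converges to a small neighbourhood of the set-point whose size is set by the boundary-layer width eps.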