983 results for Performance Track
Abstract:
Attempts to carry out terrorist attacks have become more prevalent. As a result, an increasing number of countries have become particularly vigilant against the means by which terrorists raise funds to finance their heinous acts against human life and property. Among the many counter-terrorism agencies in operation, governments have set up financial intelligence units (FIUs) within their borders for the purpose of tracking down terrorists' funds. By investigating reported suspicious transactions, FIUs attempt to weed out financial criminals who use illegal funds to finance terrorist activity. The prominent role played by FIUs means that their performance is always under the spotlight. By interviewing experts and surveying those associated with the fight against financial crime, this study investigated perceptions of FIU performance on a comparative basis between American and non-American FIUs. The target group of experts included financial institution personnel, civilian agents, law enforcement personnel, academicians, and consultants. The interview and survey questions were based on Kaplan and Norton's Balanced Scorecard (BSC) methodology. One objective of this study was to help determine the suitability of the BSC to this arena. While the FIUs in this study have concentrated on performance by measuring outputs, such as the number of suspicious transaction reports investigated, this study calls for a focus on outcomes involving all the parties responsible for financial crime investigations. Only through such an integrated approach will these various entities be able to improve their performance in solving financial crime. Experts in financial intelligence strongly believed that the quality and timeliness of intelligence were more important than keeping track of the number of suspicious transaction reports.
Finally, this study concluded that the BSC could be appropriately applied to the arena of financial crime prevention even though the emphasis is markedly different from that in the private sector. While priority in the private sector is given to financial outcomes, in this arena employee growth and internal processes were perceived as most important in achieving a satisfactory outcome.
Abstract:
Executive functions (EF) such as self-monitoring, planning, and organizing are known to develop through childhood and adolescence. They are of potential importance for learning and school performance. Earlier research into the relation between EF and school performance did not provide clear results, possibly because confounding factors such as educational track, boy-girl differences, and parental education were not taken into account. The present study therefore investigated the relation between executive function tests and school performance in a highly controlled sample of 173 healthy adolescents aged 12-18. Only students in the pre-university educational track were included, and the performance of boys was compared to that of girls. Results showed that there was no relation between the report marks obtained and performance on executive function tests, notably the Sorting Test and the Tower Test of the Delis-Kaplan Executive Function System (D-KEFS). Likewise, no relation was found between the report marks and the scores on the Behavior Rating Inventory of Executive Function—Self-Report Version (BRIEF-SR) after these were controlled for grade, sex, and level of parental education. The findings indicate that executive functioning as measured with widely used instruments such as the BRIEF-SR does not predict the school performance of adolescents in pre-university education any better than a student's grade, sex, and level of parental education.
Abstract:
U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure the safety and operational efficiency of their networks. Besides maintenance costs, other costs, such as train accident costs, train and shipment delay costs, and rolling stock maintenance costs, are also closely related to track maintenance activities. Optimizing the track maintenance process on extensive railroad networks is a very complex problem with major cost implications. Currently, the decision making process for track maintenance planning is largely manual and relies primarily on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to track maintenance optimization problems. In this dissertation study, we propose a range of mathematical models and solution algorithms for three network-level track maintenance scheduling problems: the track inspection scheduling problem (TISP), the production team scheduling problem (PTSP) and the job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams which travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem in which thousands of tasks are to be scheduled subject to many difficult side constraints, such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework. The proposed model and algorithm have been adopted by a Class I railroad in its decision making process. Real-world case studies show the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights.
PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition and restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, augmented cost function and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach is able to bring significant cost savings and operational efficiency improvement. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A vehicle routing problem based model and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints such as mutual exclusion constraints and rounding constraints were considered. The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
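The constructive-heuristic idea the abstract sketches for TISP can be illustrated in miniature. The sketch below is a deliberately simplified stand-in, not the dissertation's model: the `Task`/`Team` structures, the periodicity window fields, and the earliest-start greedy rule are all invented for illustration, and the real algorithm additionally handles travel, discrete working times, and a local search pass.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    track_id: str
    duration: int        # working days needed
    earliest: int        # periodicity window start (day index)
    latest: int          # periodicity window end (last allowed day)

@dataclass
class Team:
    name: str
    free_at: int = 0     # next day the team is available
    schedule: list = field(default_factory=list)

def greedy_schedule(tasks, teams):
    """Toy constructive heuristic: sort tasks by deadline, then assign each
    task to the team that can start it earliest within its window.
    Tasks whose window cannot be met are returned for a later repair step."""
    unassigned = []
    for task in sorted(tasks, key=lambda t: t.latest):
        best = min(teams, key=lambda tm: max(tm.free_at, task.earliest))
        start = max(best.free_at, task.earliest)
        if start + task.duration - 1 > task.latest:
            unassigned.append(task)      # window violated; left for repair
            continue
        best.schedule.append((task.track_id, start))
        best.free_at = start + task.duration
    return unassigned
```

In the full approach, a pass like this would only produce an initial solution; a local search then swaps and reinserts tasks inside an incrementally extended scheduling horizon.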
Abstract:
DUNE is a next-generation long-baseline neutrino oscillation experiment. It aims to measure the still unknown CP-violation phase $\delta_{CP}$ and the sign of $\Delta m_{13}^2$, which defines the neutrino mass ordering. DUNE will exploit a Far Detector composed of four multi-kiloton LArTPCs and a Near Detector (ND) complex located close to the neutrino source at Fermilab. The SAND detector at the ND complex is designed to perform on-axis beam monitoring, constrain uncertainties in the oscillation analysis, and perform precision neutrino physics measurements. SAND includes a 0.6 T superconducting magnet, an electromagnetic calorimeter, a 1-ton liquid argon detector (GRAIN), and a modular, low-density straw tube target tracker system. GRAIN is an innovative LAr detector in which neutrino interactions can be reconstructed using only the LAr scintillation light, imaged by an optical system based on Coded Aperture masks and lenses, a novel approach never before used in particle physics applications. In this thesis, a first evaluation of GRAIN track reconstruction and calorimetric capabilities was obtained with an optical system based on Coded Aperture cameras. A simulation of $\nu_\mu + \mathrm{Ar}$ interactions with the energy spectrum expected at the future Fermilab Long-Baseline Neutrino Facility (LBNF) was performed. The performance of SAND, combining the information provided by all its sub-detectors, was evaluated on the selection of a $\nu_\mu + \mathrm{Ar} \to \mu^- + p + X$ sample and on neutrino energy reconstruction.
Abstract:
This research activity aims at providing a reliable estimation of particular state variables and parameters concerning the dynamics and performance optimization of a MotoGP-class motorcycle, integrating the classical model-based approach with new methodologies involving artificial intelligence. The first topic of the research focuses on the estimation of the thermal behavior of the MotoGP carbon braking system. Numerical tools are developed to assess the instantaneous surface temperature distribution in the motorcycle's front brake discs. Within this application, other important brake parameters are identified using Kalman filters, such as the disc convection coefficient and the power distribution in the disc-pad contact region. Subsequently, a physical model of the brake is built to estimate the instantaneous braking torque. However, the results obtained with this approach are strongly limited by knowledge of the friction coefficient (μ) between the disc rotor and the pads. Since the value of μ is a highly nonlinear function of many variables (namely temperature, pressure, and angular velocity of the disc), an analytical model for estimating the friction coefficient is impractical to establish. To overcome this challenge, an innovative hybrid solution is implemented, combining the benefits of artificial intelligence (AI) with the classical model-based approach: the disc temperature estimated by the thermal model is processed by a machine learning algorithm that outputs the actual value of the friction coefficient, thus improving the braking torque computation performed by the physical model of the brake. Finally, the last topic of this research activity concerns the development of an AI algorithm to estimate the current sideslip angle of the motorcycle's front tire.
While a single-track motorcycle kinematic model and IMU accelerometer signals theoretically enable sideslip calculation, the presence of accelerometer noise leads to a significant drift over time. To address this issue, a long short-term memory (LSTM) network is implemented.
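The drift problem that motivates the LSTM can be demonstrated with a toy integration, not the thesis's actual pipeline: the signal shape, time step, and noise level below are invented, and the point is only that a single integration of a noisy accelerometer channel accumulates a random-walk error.

```python
import random

def integrate_lateral_velocity(accel, dt=0.01, noise_sd=0.05, seed=1):
    """Integrate a lateral-acceleration signal (m/s^2) once, adding Gaussian
    accelerometer noise at each sample, to show how the integrated velocity
    drifts away from the true value over time."""
    rng = random.Random(seed)
    v, history = 0.0, []
    for a in accel:
        v += (a + rng.gauss(0.0, noise_sd)) * dt   # noisy Euler integration
        history.append(v)
    return history
```

With a zero true acceleration the integrated velocity should stay at zero, yet the noisy integral wanders; this accumulated bias is what a learned estimator is meant to correct.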
Abstract:
Rail transportation has significant importance in the future world. This importance is tightly bound to accessible, sustainable, efficient and safe railway systems. Precise positioning in railway applications is essential for increasing railway traffic, train-track control, collision avoidance, train management and autonomous train driving. Hence, precise train positioning is a safety-critical application. Nowadays, positioning in railway applications depends heavily on a cellular-based system called GSM-R, a railway-specific version of the Global System for Mobile Communications (GSM). However, GSM-R is a relatively outdated technology and does not provide the capacity and precision demanded by future railway networks. One option for positioning is mounting Global Navigation Satellite System (GNSS) receivers on trains as a low-cost solution. Nevertheless, GNSS cannot provide continuous service due to signal interruption by harsh environments, tunnels, etc. Another option is exploiting cellular-based positioning methods. The most recent cellular technology, 5G, provides the high network capacity, low latency, high accuracy and high availability suitable for train positioning. In this thesis, an approach to 5G-based positioning for railway systems is discussed and simulated. The Observed Time Difference of Arrival (OTDOA) method and the 5G Positioning Reference Signal (PRS) are used. Simulations were run in MATLAB, extending existing 5G positioning code with Non-Line-of-Sight (NLOS) link detection and base station exclusion algorithms. Performance analysis for different configurations was completed. Results show that efficient NLOS detection improves positioning accuracy, and implementing a base station exclusion algorithm helps to increase it further.
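At its core, OTDOA reduces to multilateration from time-difference measurements. A minimal stand-alone sketch of that final step (2-D, Gauss-Newton on range-difference residuals, with the normal equations solved by hand) is shown below; the base station layout and measurement model are invented for illustration, and the thesis itself works with MATLAB's 5G PRS tooling rather than anything like this.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gauss_newton_tdoa(bs, tdoas, guess, iters=25):
    """Estimate a 2-D position from TDOA measurements (seconds, each relative
    to reference station bs[0]) via Gauss-Newton least squares.
    bs: list of (x, y) base station coordinates in metres."""
    x, y = guess
    for _ in range(iters):
        d = [math.hypot(x - bx, y - by) for bx, by in bs]
        J, r = [], []
        for i in range(1, len(bs)):
            # residual: modelled range difference minus measured one (metres)
            r.append((d[i] - d[0]) - C * tdoas[i - 1])
            # Jacobian row: unit vector to station i minus unit vector to ref
            J.append(((x - bs[i][0]) / d[i] - (x - bs[0][0]) / d[0],
                      (y - bs[i][1]) / d[i] - (y - bs[0][1]) / d[0]))
        # solve the 2x2 normal equations J^T J * delta = J^T r by hand
        a11 = sum(jx * jx for jx, _ in J)
        a12 = sum(jx * jy for jx, jy in J)
        a22 = sum(jy * jy for _, jy in J)
        b1 = sum(jx * ri for (jx, _), ri in zip(J, r))
        b2 = sum(jy * ri for (_, jy), ri in zip(J, r))
        det = a11 * a22 - a12 * a12
        x, y = x - (a22 * b1 - a12 * b2) / det, y - (a11 * b2 - a12 * b1) / det
    return x, y
```

An NLOS-detection step, as in the thesis, would discard or down-weight the rows of `J` and `r` coming from stations whose links are flagged as non-line-of-sight before the solve.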
Abstract:
The current dominance of African runners in long-distance running is an intriguing phenomenon that highlights the close relationship between genetics and physical performance. Many factors in the interesting interaction between genotype and phenotype (e.g., high cardiorespiratory fitness, higher hemoglobin concentration, good metabolic efficiency, muscle fiber composition, enzyme profile, diet, altitude training, and psychological aspects) have been proposed in the attempt to explain the extraordinary success of these runners. Increasing evidence shows that genetics may be a determining factor in physical and athletic performance. But could this also be true for African long-distance runners? Based on this question, this brief review examines the role of genetic factors (mitochondrial deoxyribonucleic acid, the Y chromosome, and the angiotensin-converting enzyme and alpha-actinin-3 genes) in the remarkable athletic performance observed in African runners, especially Kenyans and Ethiopians, despite their environmental constraints.
Abstract:
A rapid, sensitive and specific method for quantifying propylthiouracil in human plasma, using methylthiouracil as the internal standard (IS), is described. The analyte and the IS were extracted from plasma by liquid-liquid extraction with an organic solvent (ethyl acetate). The extracts were analyzed by high-performance liquid chromatography coupled with electrospray tandem mass spectrometry (HPLC-MS/MS) in negative mode (ES-). Chromatography was performed on a Phenomenex Gemini C18 5 μm analytical column (150 mm × 4.6 mm i.d.) with a mobile phase of methanol/water/acetonitrile (40/40/20, v/v/v) + 0.1% formic acid. For propylthiouracil and the IS, the optimized declustering potential, collision energy and collision exit potential were -60 V, -26 eV and -5 V, respectively. The method had a chromatographic run time of 2.5 min and a linear calibration curve over the range 20-5000 ng/mL. The limit of quantification was 20 ng/mL. The stability tests indicated no significant degradation. This HPLC-MS/MS procedure was used to assess the bioequivalence of two propylthiouracil 100 mg tablet formulations in healthy volunteers of both sexes in fasted and fed states. The geometric means and 90% confidence intervals (CI) of the Test/Reference percent ratios were, without and with food, respectively: 109.28% (103.63-115.25%) and 115.60% (109.03-122.58%) for Cmax, and 103.31% (100.74-105.96%) and 103.40% (101.03-105.84%) for AUClast. This method offers advantages over those previously reported, in terms of both a simple liquid-liquid extraction without clean-up procedures and a faster run time (2.5 min). The LOQ of 20 ng/mL is well suited for pharmacokinetic studies. The assay performance results indicate that the method is precise and accurate enough for the routine determination of propylthiouracil in human plasma. The test formulation, with and without food, was bioequivalent to the reference formulation.
Food administration increased the Tmax and decreased the bioavailability (Cmax and AUC).
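The geometric mean ratio and 90% CI reported above come from a standard analysis on the log scale. A simplified sketch of that computation is given below, assuming a paired (within-subject) layout rather than the full crossover ANOVA; the data and the `t_crit` argument are placeholders the caller must supply (in practice the t quantile comes from tables or `scipy.stats.t.ppf(0.95, df)`).

```python
import math
from statistics import mean, stdev

def gmr_90ci(test, ref, t_crit):
    """Geometric mean Test/Reference ratio with a 90% CI, computed from
    paired log-transformed concentrations (e.g. per-subject Cmax or AUC).
    t_crit: two-sided 90% Student-t quantile for n-1 degrees of freedom.
    Returns (point, lower, upper) as percentages, as in the abstract."""
    diffs = [math.log(t) - math.log(r) for t, r in zip(test, ref)]
    m = mean(diffs)
    se = stdev(diffs) / math.sqrt(len(diffs))   # standard error of the mean
    point = math.exp(m)
    lo, hi = math.exp(m - t_crit * se), math.exp(m + t_crit * se)
    return point * 100, lo * 100, hi * 100
```

Bioequivalence is typically concluded when the whole 90% CI falls within the 80-125% acceptance range, which is how ratios like 103.31% (100.74-105.96%) support the conclusion above.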
Abstract:
The aim of this study was to evaluate the performance of the Centers for Dental Specialties (CDS) in Brazil and its associations with sociodemographic indicators of the municipalities, structural variables of the services, and primary health care organization in the years 2004-2009. The study used secondary data on procedures performed in the CDS in the specialties of periodontics, endodontics, surgery and primary care. Bivariate analysis with the χ² test was used to test the association between the dependent variable (performance of the CDS) and the independent variables, followed by Poisson regression analysis. With regard to the overall achievement of targets, the performance of the majority of the CDS (69.25%) was considered poor/regular. The independent factors associated with poor/regular CDS performance were: municipalities belonging to the Northeast, South and Southeast regions, lower Human Development Index (HDI), lower population density, and shorter time since implementation. HDI and population density are important for the performance of the CDS in Brazil. Similarly, the peculiarities of less populated areas, as well as regional location and time since CDS implementation, should be taken into account in the planning of these services.
Abstract:
Monte Carlo track structure (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user-defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in the published literature in DNA strand break (SB) yields and selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28-220 keV were superimposed on a geometric DNA model composed of 2.7 × 10^6 nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy E_TH. The SB frequencies and complexities in nucleosomes as a function of incident electron energy were obtained. SBs were classified into higher-order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give SB yields similar to the energy deposit-based definition when E_TH ≈ 10.79 eV, but deviate significantly for higher E_TH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes presenting a complex damage pattern of more than 2 SBs, and the degree of complexity of the damage in these nucleosomes, diminish as the incident electron energy increases. DNA damage classification into SSBs and DSBs is highly dependent on the definitions of these higher-order structures and their implementations.
The authors show that, for the four studied models, yields differ by up to 54% for SSBs and up to 32% for DSBs, depending on the incident electron energy and the models being compared. MCTS simulations allow comparison of the direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms such as the DNA model, dose distribution, SB definition, and DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.
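The abstract's point that SSB/DSB yields hinge on the clustering definition can be made concrete with a toy classifier. The representation below (breaks as `(position_bp, strand)` pairs) and the 10 bp threshold are illustrative assumptions, not the paper's actual algorithm:

```python
def classify_breaks(breaks, max_sep=10):
    """Cluster strand breaks, given as (position in bp, strand 0 or 1),
    by grouping breaks whose positions lie within max_sep bp of the
    previous break, then label each cluster: DSB if both strands are
    hit within the cluster, SSB otherwise."""
    if not breaks:
        return []
    breaks = sorted(breaks)
    clusters, current = [], [breaks[0]]
    for b in breaks[1:]:
        if b[0] - current[-1][0] <= max_sep:
            current.append(b)
        else:
            clusters.append(current)
            current = [b]
    clusters.append(current)
    return ["DSB" if {s for _, s in c} == {0, 1} else "SSB" for c in clusters]
```

Changing `max_sep` or requiring the opposite-strand breaks to be within the threshold of each other (rather than chained) changes the SSB/DSB split, which is exactly the definition sensitivity the authors report.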
Abstract:
To examine the influence of l-arginine supplementation combined with physical training on mitochondrial biomarkers from the gastrocnemius muscle and their relationship with physical performance. Male Wistar rats were divided into four groups: sedentary control (SD), sedentary supplemented with l-arginine (SDLA), trained (TR), and trained supplemented with l-arginine (TRLA). l-Arginine was administered by gavage (62.5 mg/ml/day/rat). Physical training consisted of 60 min/day, 5 days/week, 0% grade, at a speed of 1.2 km/h. The study lasted 8 weeks. Skeletal muscle mitochondria-enriched and cytoplasmic fractions were obtained for Western blotting and biochemical analyses. Protein expression of the transcriptional coactivator PGC-1α, the transcription factor mtTFA, ATP synthase subunit c, cytochrome oxidase (COXIV), the constitutive nitric oxide synthases (eNOS and nNOS), Cu/Zn-superoxide dismutase (SOD) and manganese-SOD (Mn-SOD) was evaluated. We also assessed plasma lipid profile, glycemia and malondialdehyde (MDA) levels. Nitrite/nitrate (NOx(-)) levels were measured in both plasma and the cytosolic fraction of the gastrocnemius muscle. Eight weeks of l-arginine supplementation associated with physical training was effective in promoting greater exercise tolerance, accompanied by up-regulation of the protein expression of mtTFA, PGC-1α, ATP synthase subunit c, COXIV, Cu/Zn-SOD and Mn-SOD. The upstream pathway was associated with improved NO bioavailability, but not NO production, since no changes in nNOS or eNOS protein expression were observed. This combination could be an alternative approach for preventing cardiometabolic diseases, given that in overt disease a profound impairment in patients' physical performance is observed.
Abstract:
Objective: To adapt the 6-minute walk test (6MWT) to artificial gait in patients with complete spinal cord injury (SCI) aided by neuromuscular electrical stimulation. Method: Nine male individuals with paraplegia (AIS A) participated in this study. Lesion levels varied between T4 and T12, and time post-injury from 4 to 13 years. Patients performed 6MWT 1 and 6MWT 2, using neuromuscular electrical stimulation and aided by a walker. Differences between the two 6MWTs were assessed using a paired t test; multiple r-squared was also calculated. Results: 6MWT 1 and 6MWT 2 were not statistically different for heart rate, distance, mean speed or blood pressure. Multiple r-squared (r² = 0.96) explained 96% of the variation in the distance walked. Conclusion: The use of the 6MWT in artificial gait to assess exercise walking capacity is reproducible and easy to apply, and it can be used to assess the clinical performance of SCI artificial gait.
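The reproducibility check above rests on a paired t test between the two repeated walks. A minimal sketch of the test statistic (the study's actual data are not reproduced here; the example numbers below are invented) might be:

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic for two repeated measurements on the same subjects:
    t = mean(d) / (sd(d) / sqrt(n)), where d are the within-subject differences.
    The p-value then comes from the t distribution with n-1 degrees of freedom."""
    diffs = [a - b for a, b in zip(x, y)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
```

A small |t| relative to the critical value for n-1 degrees of freedom is what "not statistically different" means for each outcome (heart rate, distance, mean speed, blood pressure).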
Abstract:
The pathological mechanisms underlying cognitive dysfunction in multiple sclerosis (MS) are not yet fully understood and, in addition to demyelinating lesions and gray-matter atrophy, subclinical disease activity may play a role. The aim was to evaluate the contribution of asymptomatic gadolinium-enhancing lesions to cognitive dysfunction, along with gray-matter damage and callosal atrophy, in relapsing-remitting MS (RRMS) patients. Forty-two treated RRMS patients and 30 controls were evaluated. MRI (3T) variables of interest were brain white-matter and cortical lesion load, cortical and deep gray-matter volumes, corpus callosum volume, and the presence of gadolinium-enhancing lesions. Outcome variables included the EDSS, MS Functional Composite (MSFC) subtests, and the Brief Repeatable Battery of Neuropsychological Tests. Cognitive dysfunction was defined as deficits in two or more cognitive subtests. Multivariate regression analyses assessed the contribution of MRI metrics to outcomes. Patients with cognitive impairment (45.2%) had more cortical lesions and lower gray-matter and callosal volumes. Patients with subclinical MRI activity (15%) had worse cognitive performance. Clinical disability on the MSFC was mainly associated with putaminal atrophy. The main independent predictors of cognitive deficits were a high burden of cortical lesions and the number of gadolinium-enhancing lesions. Cognitive dysfunction was especially related to a high burden of cortical lesions and to subclinical disease activity. Cognitive studies in MS should consider subclinical disease activity as a potential contributor to cognitive impairment.
Abstract:
cDNA arrays are a powerful tool for discovering gene expression patterns. Nylon arrays have the advantage that they can be re-used several times. A key issue in high-throughput gene expression analysis is sensitivity. In the case of nylon arrays, signal detection can be affected by the plastic bags used to keep the membranes humid. In this study, we evaluated the effect of five types of plastic on radioactive transmittance, the number of genes with a signal above background, and data variability. A polyethylene plastic bag 69 μm thick had a strong shielding effect that blocked 68.7% of the radioactive signal. This shielding effect on transmittance decreased the number of detected genes and increased data variability. Other, thinner plastics gave better results. Although plastics made from polyvinylidene chloride, polyvinyl chloride (both 13 μm thick) and polyethylene (29 and 7 μm thick) showed different levels of transmittance, they all gave similarly good performance. Polyvinylidene chloride and polyethylene 29 μm thick were the plastics of choice because of their easy handling. For other types of plastic, it is advisable to run a simple check on their performance in order to obtain the maximum information from nylon cDNA arrays.