962 results for dust proof
Abstract:
Quantitative methods can help us understand how underlying attributes contribute to movement patterns. Applying principal components analysis (PCA) to whole-body motion data may provide an objective, data-driven method to identify unique and statistically important movement patterns. Therefore, the primary purpose of this study was to determine whether athletes' movement patterns can be differentiated based on skill level or sport played using PCA. Motion capture data from 542 athletes performing three sport-screening movements (i.e. bird-dog, drop jump, T-balance) were analyzed using a PCA-based pattern recognition technique. Before analyzing the effects of skill level or sport on movement patterns, methodological considerations related to the motion-analysis reference coordinate system were assessed. All analyses were addressed as case studies. In the first case study, referencing motion data to a global (lab-based) coordinate system rather than a local (segment-based) coordinate system affected the ability to interpret important movement features. In the second case study, which assessed the interpretability of PCs when data were referenced to a stationary versus a moving segment-based coordinate system, PCs were more interpretable when data were referenced to a stationary coordinate system for both the bird-dog and T-balance tasks. Based on the findings of case studies 1 and 2, only stationary segment-based coordinate systems were used in case studies 3 and 4. During the bird-dog task, elite athletes had significantly lower scores than recreational athletes for principal component (PC) 1. For the T-balance movement, elite athletes had significantly lower scores than recreational athletes for PC 2. In both analyses, the lower scores in elite athletes represented a greater range of motion.
Finally, case study 4 reported differences in the movement patterns of athletes who competed in different sports; significant differences in technique were detected during the bird-dog task. Through these case studies, this thesis highlights the feasibility of applying PCA as a movement-pattern recognition technique in athletes. Future research can build on this proof-of-principle work to develop robust quantitative methods that help us better understand how underlying attributes (e.g. height, sex, ability, injury history, training type) contribute to performance.
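The PCA-based pattern recognition described above can be sketched as follows. This is a minimal illustration with invented toy data, not the thesis's actual pipeline: each athlete's time-normalized movement waveform becomes one row of a matrix, PCA (via SVD) extracts dominant movement patterns, and each athlete's PC scores could then be compared between groups (e.g. elite vs. recreational).

```python
# Minimal sketch of PCA-based movement pattern recognition (toy data, not the
# thesis's real motion-capture pipeline). Each row is one athlete's waveform.
import numpy as np

def pca_movement_patterns(X, n_components=3):
    """X: (n_athletes, n_waveform_points).
    Returns per-athlete PC scores, the PC loading vectors, and variance explained."""
    Xc = X - X.mean(axis=0)                            # centre each waveform point
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # singular values sorted descending
    scores = U[:, :n_components] * S[:n_components]    # PC scores per athlete
    var_explained = (S ** 2) / np.sum(S ** 2)
    return scores, Vt[:n_components], var_explained[:n_components]

# Toy data: 20 "athletes", 50-point waveforms sharing one dominant pattern
# whose amplitude varies per athlete (standing in for range of motion).
rng = np.random.default_rng(0)
pattern = np.sin(np.linspace(0, np.pi, 50))
X = rng.normal(0, 0.1, (20, 50)) + rng.normal(0, 1, (20, 1)) * pattern

scores, components, var = pca_movement_patterns(X)
```

Group differences would then be tested on the `scores` columns (e.g. PC 1 scores of elite vs. recreational athletes), which is the sense in which "lower PC scores" are reported above.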
Abstract:
Background: Women with germline BRCA1 mutations have a high lifetime risk of breast cancer, and the only available risk-reduction strategies are risk-reducing surgery or chemoprevention. These women predominantly develop triple-negative breast cancers; hence, it is unlikely that selective estrogen receptor modulators (SERMs) will reduce their risk of developing cancer, because SERMs have not been shown to reduce the incidence of estrogen receptor–negative breast cancers. Preclinical data from our laboratory suggest that exposure to estrogen and its metabolites can cause DNA double-strand breaks (DSBs) and thus drive genomic instability, an early hallmark of BRCA1-related breast cancer. Therefore, an approach that lowers circulating estrogen levels and reduces estrogen metabolite exposure may prove a successful chemopreventive strategy.
Aims: To provide proof of concept for the hypothesis that the combination of luteinizing hormone–releasing hormone agonists (LHRHa) and aromatase inhibitors (AIs) can suppress circulating levels of estrogen and its metabolites in BRCA1 mutation carriers, thus reducing estrogen metabolite levels in breast cells, reducing DNA DSBs, and potentially reducing the incidence of breast cancer.
Methods: Twelve premenopausal BRCA1 mutation carriers will undergo baseline ultrasound-guided breast core biopsy and plasma and urine sampling. Half the women will be treated for 3 months with combination goserelin (LHRHa) plus anastrozole (AI), and the remainder with tamoxifen (SERM), before repeat tissue, plasma, and urine sampling. After a 1-month washout period, the groups will cross over for a further 3 months of treatment before final biologic sample collection. Tissue, plasma, and urine samples will be examined using a combination of immunohistochemistry, comet assays, and ultrahigh-performance liquid chromatography–tandem mass spectrometry to assess the impact of LHRHa plus AI, compared with SERM, on levels of DNA damage, estrogens, and genotoxic estrogen metabolites. Quality of life will also be assessed during the study.
Results: This trial is currently ongoing.
Abstract:
For an arbitrary associative unital ring R, let J1 and J2 be the following noncommutative, birational, partly defined involutions on the set M3(R) of 3×3 matrices over R: J1(M) = M⁻¹ (the usual matrix inverse) and J2(M)_jk = (M_kj)⁻¹ (the transpose of the Hadamard inverse).
We prove the surprising conjecture by Kontsevich that (J2 ∘ J1)³ is the identity map modulo the Diag_L × Diag_R action (D1, D2)(M) = D1⁻¹ M D2 of pairs of invertible diagonal matrices. That is, we show that, for each M in the domain where (J2 ∘ J1)³ is defined, there are invertible diagonal 3×3 matrices D1 = D1(M) and D2 = D2(M) such that (J2 ∘ J1)³(M) = D1⁻¹ M D2.
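The identity above can be checked exactly on a rational example (the commutative case of the theorem). The sketch below is a verification on one random matrix, not a proof: if (J2 ∘ J1)³(M) = D1⁻¹ M D2 for diagonal D1, D2, then the entrywise ratio N_jk / M_jk must equal d2_k / d1_j, i.e. the ratio matrix must have rank 1.

```python
# Exact check of the Kontsevich periodicity on a random rational 3x3 matrix,
# using stdlib Fractions. J1 = matrix inverse; J2 = transpose of the Hadamard
# (entrywise) inverse. After three applications of J2∘J1, the entrywise ratio
# to the original matrix should factor as d2_k/d1_j (a rank-1 matrix).
from fractions import Fraction
import random

def det3(M):
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def J1(M):
    """3x3 inverse via the adjugate; cofactor C[r][c] via cyclic indices mod 3."""
    D = det3(M)
    C = [[M[(r+1) % 3][(c+1) % 3] * M[(r+2) % 3][(c+2) % 3]
          - M[(r+1) % 3][(c+2) % 3] * M[(r+2) % 3][(c+1) % 3]
          for c in range(3)] for r in range(3)]
    return [[C[c][r] / D for c in range(3)] for r in range(3)]  # adjugate / det

def J2(M):
    """Transpose of the Hadamard inverse: J2(M)_jk = 1 / M_kj."""
    return [[Fraction(1) / M[c][r] for c in range(3)] for r in range(3)]

random.seed(1)
while True:  # retry in the (measure-zero) degenerate case where a step is undefined
    M0 = [[Fraction(random.randint(1, 9)) for _ in range(3)] for _ in range(3)]
    try:
        N = M0
        for _ in range(3):
            N = J2(J1(N))
        break
    except ZeroDivisionError:
        continue

R = [[N[j][k] / M0[j][k] for k in range(3)] for j in range(3)]
rank_one = all(R[j][k] * R[0][0] == R[j][0] * R[0][k]
               for j in range(3) for k in range(3))
```

Because all arithmetic is over the rationals, `rank_one` holding is an exact (not floating-point) confirmation for this particular M.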
Abstract:
Research activities during this period concentrated on continuing field and laboratory testing for the Dallas County test road. Stationary ditch collection of dust was eliminated because of inconsistent data and vandalism to collectors. Braking tests were developed and initiated to evaluate the influence of treatments on the braking and safety characteristics of the test sections. Dust testing was initiated for out-of-the-wheelpath as well as in-the-wheelpath conditions. Contrary to the results obtained during the summer and fall of 1987, the 1.5 percent bentonite treatment appears to be outperforming the other bentonite-treated sections after over a year of service. Overall dust reduction appears to average between 25 and 35 percent. Dallas County applied 300 tons per mile of Class A roadstone maintenance surfacing to the test road in August 1988. Test data indicate that the bentonite is capable of interacting with the new surfacing material to reduce its dust generation. Again, the 1.5 percent bentonite treatment appeared the most effective. The fine-particulate bonding and aggregation mechanism of the bentonite appears to recover from the environmental effects of winter and from alternating wet and dry road-surface conditions. The magnesium chloride treatment appears capable of long-term (over one year) dust reduction and exhibited an overall average reduction in the range of 15 to 30 percent. The magnesium chloride treatment also appears capable of interacting with newly applied crushed stone to reduce dust generation. Two additional one-mile test roads were to have been constructed early this year. Because of an extremely dry spring and summer, construction could not be scheduled until August, which would have allowed only minimal data collection.
Considering this, and the fact that the summer was atypically dry, it was our opinion that it would be in the best interest of the research project to extend it (at no additional cost) for a period of one year. The two additional test roads will be constructed in early spring 1989 in Adair and Marion counties.
Abstract:
Elongated dust grains exist in astrophysical plasmas. Anisotropic growth of elliptical dust grains, via plasma deposition, occurs if the deposited ions are non-inertial. In reality, the extent of such growth depends on the initial kinetic energy of the ions and the magnitude of the electric field in the sheath. Simulations of the dynamics of the ions in the sheath are reported, showing how elliptical growth is related to the initial eccentricity and size of the seed relative to the sheath length. Consequences for the eventual fate of elliptical dust are then discussed.
Abstract:
Context. Recent observations of brown dwarf spectroscopic variability in the infrared infer the presence of patchy cloud cover. Aims. This paper proposes a mechanism for producing inhomogeneous cloud coverage due to the depletion of cloud particles through the Coulomb explosion of dust in atmospheric plasma regions. Charged dust grains Coulomb-explode when the electrostatic stress of the grain exceeds its mechanical tensile stress, which results in grains below a critical radius a < a_crit^Coul being broken up. Methods. This work outlines the criteria required for the Coulomb explosion of dust clouds in substellar atmospheres, the effect on the dust particle size distribution function, and the resulting radiative properties of the atmospheric regions. Results. Our results show that for an atmospheric plasma region with an electron temperature of Te = 10 eV (≈10⁵ K), the critical grain radius varies from 10⁻⁷ to 10⁻⁴ cm, depending on the grains' tensile strength. Higher critical radii, up to 10⁻³ cm, are attainable for higher electron temperatures. We find that the process produces a bimodal particle size distribution composed of stable nanoscale seed particles and dust particles with a ≥ a_crit^Coul, with the intervening particle sizes defining a region devoid of dust. As a result, the dust population is depleted, and the clouds become optically thin in the wavelength range 0.1–10 μm, with a characteristic peak that shifts to longer wavelengths as more sub-micrometer particles are destroyed. Conclusions. In an atmosphere populated with a distribution of plasma volumes, this will yield regions of contrasting radiative properties, thereby giving a source of inhomogeneous cloud coverage. The results presented here may also be relevant for dust in supernova remnants and protoplanetary disks.
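The explosion criterion above can be sketched with a back-of-envelope estimate. This is an assumption-laden simplification, not the paper's model: for a spherical grain at surface potential φ, the outward electrostatic stress is ε0 E²/2 with surface field E = φ/a, so the grain breaks up when ε0 φ²/(2a²) exceeds the tensile strength Σ, giving a_crit = φ √(ε0 / (2Σ)). The floating potential φ ≈ 2.5 kB·Te/e is an assumed order-of-magnitude choice.

```python
# Hedged order-of-magnitude sketch of the Coulomb-explosion critical radius
# (simplified criterion, not the paper's exact model):
#   eps0 * phi^2 / (2 a^2) > Sigma  =>  a_crit = phi * sqrt(eps0 / (2 Sigma))
# Assumption: grain floating potential phi ~ 2.5 * kB * Te / e.
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def a_crit_m(Te_eV, Sigma_Pa, phi_factor=2.5):
    phi = phi_factor * Te_eV  # volts, since kB*Te/e in volts equals Te in eV
    return phi * math.sqrt(EPS0 / (2.0 * Sigma_Pa))

# Te = 10 eV, tensile strengths from weak aggregates (~1e3 Pa) to compact
# solids (~1e9 Pa): a_crit spans roughly 1e-7 to 1e-4 cm, as quoted above.
for Sigma in (1e3, 1e6, 1e9):
    a_cm = a_crit_m(10.0, Sigma) * 100.0
    print(f"Sigma = {Sigma:.0e} Pa -> a_crit ~ {a_cm:.1e} cm")
```

Under these assumptions the estimate reproduces the quoted 10⁻⁷–10⁻⁴ cm range for Te = 10 eV, and a_crit scales linearly with Te, consistent with larger critical radii at higher electron temperatures.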
Abstract:
Bikeshares promote healthy lifestyles and sustainability among commuters, casual riders, and tourists. However, the central pillar of modern systems, the bike station, cannot be easily integrated into a compact college campus. Fixed stations lack the flexibility to meet the needs of college students who make quick, short-distance trips. Additionally, the cost of implementing and maintaining each station prohibits increasing the number of stations for user convenience. Therefore, the team developed a stationless bikeshare based on a smartlock permanently attached to the bicycles in the system. The smartlock system design incorporates several innovative approaches to provide the usability, security, and reliability that overcome the limitations of a station-centered design. A focus group discussion allowed the team to receive feedback on the early lock, system, and website designs, identify improvements, and craft a pleasant user experience. The team designed a unique two-step lock system that is intuitive to operate while mitigating user error. To ensure security, user access is limited through near-field communication (NFC) technology connected to a mechatronic release system. This system relied on an NFC module and a servo driven by an Arduino microcontroller programmed in the Arduino IDE. To track rentals and maintain the system, each bike is fitted with an XBee module to communicate with a scalable ZigBee mesh network. The network allows bidirectional, real-time communication with a Meteor.js web application, which enables user and administrator functions through an intuitive user interface available on mobile and desktop. The development of an independent smartlock to replace bike stations is essential to meet the needs of the modern college student.
With the goal of creating a bikeshare that better serves college students, Team BIKES has laid the framework for a system that is affordable, easily adaptable, and implementable at any university interested in bringing a bikeshare to its campus.
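The control flow of such a smartlock can be sketched as a small state machine. The sketch below is hypothetical: the state names, method names, and the concrete two steps are illustrative assumptions standing in for the team's actual design, and the radio is mocked rather than a real XBee/ZigBee transport.

```python
# Hypothetical sketch of a two-step smartlock flow: an NFC tap is checked
# against authorized user IDs (step 1, servo release), then the rider completes
# the mechanical action (step 2), and events are reported over a mocked radio.
# All names here are illustrative assumptions, not Team BIKES's actual code.
from enum import Enum

class LockState(Enum):
    LOCKED = "locked"
    RELEASED = "released"   # step 1 done: servo has freed the latch
    UNLOCKED = "unlocked"   # step 2 done: rider has opened the shackle

class SmartLock:
    def __init__(self, bike_id, authorized_ids, radio=None):
        self.bike_id = bike_id
        self.authorized = set(authorized_ids)
        self.state = LockState.LOCKED
        self.radio = radio  # stand-in for the mesh-network transport

    def nfc_tap(self, user_id):
        """Step 1: authorize the NFC credential and drive the servo release."""
        if self.state is LockState.LOCKED and user_id in self.authorized:
            self.state = LockState.RELEASED
            self._report("released", user_id)
            return True
        return False  # unknown tag, or lock not in the LOCKED state

    def shackle_opened(self):
        """Step 2: rider completes the mechanical action; rental starts."""
        if self.state is LockState.RELEASED:
            self.state = LockState.UNLOCKED
            self._report("rental_start", None)
            return True
        return False  # guards against skipping step 1 (mitigates user error)

    def _report(self, event, user_id):
        if self.radio is not None:
            self.radio.send({"bike": self.bike_id, "event": event, "user": user_id})

# Demo: an authorized tap followed by the mechanical step completes a rental.
lock = SmartLock("bike-1", authorized_ids={"user-1"})
tap_ok = lock.nfc_tap("user-1")
opened = lock.shackle_opened()
```

Gating each step on the previous state is one way a two-step design can mitigate user error: the mechanical step is a no-op unless an authorized tap has already released the latch.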
Abstract:
Hybridisation is a systematic process through which the characteristic features of hybrid logic, at both the syntactic and the semantic levels, are developed on top of an arbitrary logic framed as an institution. In a series of papers this process has been detailed and taken as a basis for a specification methodology for reconfigurable systems. The present paper extends this work by showing how a proof calculus (in both a Hilbert-style and a tableau-based format) for the hybridised version of a logic can be systematically generated from a proof calculus for the base logic. These developments provide the basis for a complete proof theory for hybrid(ised) logics, and thus pave the way for the development of dedicated proof support.
Abstract:
Recursion is a well-known and powerful programming technique, with a wide variety of applications. The dual technique of corecursion is less well-known, but is increasingly proving to be just as useful. This article is a tutorial on the four main methods for proving properties of corecursive programs: fixpoint induction, the approximation (or take) lemma, coinduction, and fusion.
Abstract:
Corecursive programs produce values of greatest fixpoint types, in contrast to recursive programs, which consume values of least fixpoint types. There are a number of widely used methods for proving properties of corecursive programs, including fixpoint induction, the take lemma, and coinduction. However, these methods are all rather low level, in that they do not exploit the common structure that is often present in corecursive definitions. We argue for a more structured approach to proving properties of corecursive programs. In particular, we show that by writing corecursive programs using a simple operator that encapsulates a common pattern of corecursive definition, we can then use high-level algebraic properties of this operator to conduct proofs in a purely calculational style that avoids the use of inductive or coinductive methods.
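The "simple operator that encapsulates a common pattern of corecursive definition" can be illustrated as an unfold. The abstracts above concern lazy functional programs (typically Haskell); the rendering below uses Python generators as a stand-in for lazy infinite streams, so it illustrates the pattern rather than the paper's actual setting.

```python
# Illustrative rendering of the unfold pattern: a single operator that
# captures a common shape of corecursive definition. Python generators stand
# in for the lazy infinite streams of the original (Haskell) setting.
from itertools import islice

def unfold(step, seed):
    """Corecursively produce a stream: step(seed) -> (value, next_seed)."""
    while True:
        value, seed = step(seed)
        yield value

# Two corecursive programs written as instances of the one operator:
def naturals():
    return unfold(lambda n: (n, n + 1), 0)

def fibs():
    return unfold(lambda p: (p[0], (p[1], p[0] + p[1])), (0, 1))

print(list(islice(naturals(), 5)))  # [0, 1, 2, 3, 4]
print(list(islice(fibs(), 7)))      # [0, 1, 1, 2, 3, 5, 8]
```

Writing both streams through the one operator is what enables the calculational style argued for above: algebraic laws about `unfold` (e.g. how it fuses with mapping) can be applied uniformly to any program written in this form, instead of reasoning coinductively about each definition separately.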
Abstract:
We present a framework for describing proof planners. This framework is based around a decomposition of proof planners into planning states, proof language, proof plans, proof methods, proof revision, proof control, and planning algorithms. We use this framework to motivate the comparison of three recent proof planning systems, λClam, OMEGA, and IsaPlanner, and demonstrate how the framework allows us to discuss and illustrate both their similarities and differences in a consistent fashion. This analysis reveals that proof control and the use of contextual information in planning states are key areas in need of further investigation.
Abstract:
This paper presents a generic architecture for proof planning systems in terms of an interaction between a customisable proof module and search module. These refer to both global and local information contained in reasoning states.
Abstract:
This paper reports the use of proof planning to diagnose errors in program code. In particular it looks at the errors that arise in the base cases of recursive programs produced by undergraduates. It describes two classes of error that arise in this situation. The use of test cases would catch these errors but would fail to distinguish between them. The system adapts proof critics, commonly used to patch faulty proofs, to diagnose such errors and distinguish between the two classes. It has been implemented in Lambda-clam, a proof planning system, and applied successfully to a small set of examples.
Abstract:
Proof critics are a technology from the proof planning paradigm. They examine failed proof attempts in order to extract information which can be used to generate a patch which will allow the proof to go through. We consider the proof of the "whisky problem", a challenge problem from the domain of temporal logic. The proof requires a generalisation of the original conjecture and we examine two proof critics which can be used to create this generalisation. Using these critics we believe we have produced the first automatic proofs of this challenge problem. We use this example to motivate a comparison of the two critics and propose that there is a place for specialist critics as well as powerful general critics. In particular we advocate the development of critics that do not use meta-variables.