Abstract:
Collagen fibrillation within articular cartilage (AC) plays a key role in the progression of joint osteoarthritis (OA) and, therefore, changes in collagen synthesis could serve as an indicator for use in the assessment of OA. Various staining techniques have been developed and used to determine collagen network transformation under microscopy. However, because collagen and proteoglycan coexist and have the same index of refraction, specific visualization of collagen tissue using conventional methods is difficult. This study aimed to develop an advanced staining technique to distinguish collagen from proteoglycan and to determine its evolution in relation to OA progression using optical and laser scanning confocal microscopy (LSCM). A number of AC samples were obtained from sheep joints, including both healthy joints and abnormal joints with OA grades 1 to 3. The samples were stained using two different trichrome methods and immunohistochemistry (IHC) to stain both colourimetrically and with fluorescence. Using optical microscopy and LSCM, the present authors demonstrated that the IHC technique stains collagens only, allowing the collagen network to be separated and directly investigated. Fluorescently stained IHC samples were also examined with LSCM to obtain three-dimensional images of the collagen fibres. Changes in the collagen fibres were then correlated with the grade of OA in the tissue. This study is the first to successfully utilize the IHC staining technique in conjunction with laser scanning confocal microscopy, providing a valuable tool for assessing changes to articular cartilage in OA.
Abstract:
The Texas Transportation Commission (“the Commission”) is responsible for planning and making policies for the location, construction, and maintenance of a comprehensive system of highways and public roads in Texas. In order for the Commission to carry out its legislative mandate, the Texas Constitution requires that most revenue generated by motor vehicle registration fees and motor fuel taxes be used for constructing and maintaining public roadways and other designated purposes. The Texas Department of Transportation (TxDOT) assists the Commission in executing state transportation policy. It is the responsibility of the legislature to appropriate money for TxDOT’s operation and maintenance expenses. All money authorized to be appropriated for TxDOT’s operations must come from the State Highway Fund (also known as Fund 6, Fund 006, or Fund 0006). The Commission can then use the balance in the fund to fulfill its responsibilities. However, the value of the revenue received in Fund 6 is not keeping pace with growing demand for transportation infrastructure in Texas. Additionally, diversion of revenue to nontransportation uses now exceeds $600 million per year. As shown in Figure 1.1, revenues and expenditures of the State Highway Fund per vehicle mile traveled (VMT) in Texas have remained almost flat since 1993. In the meantime, construction cost inflation has gone up more than 100%, effectively halving the value of expenditure.
Abstract:
Analytical and closed-form solutions are presented in this paper for the vibration response of an L-shaped plate under a point force or a moment excitation. Inter-relationships between wave components of the source and the receiving plates are clearly defined. Explicit expressions are given for quadratic quantities such as the input power, energy flow, and kinetic energy distributions of the L-shaped plate. Applications of the statistical energy analysis (SEA) formulation in predicting the vibration response of finite coupled plate structures under a single deterministic forcing are examined and quantified. It is found that the SEA method can be employed to predict the frequency-averaged vibration response and energy flow of coupled plate structures under a deterministic force or moment excitation when the structural system satisfies the following conditions: (1) the coupling loss factors of the coupled subsystems are known; (2) the source location is more than a quarter of the plate bending wavelength away from the source plate edges in the point force excitation case, or more than a quarter wavelength away from the pair of source plate edges perpendicular to the moment axis in the moment excitation case, owing to the directional characteristic of moment excitations. SEA overestimates the response of the L-shaped plate when the source location is less than a quarter bending wavelength away from the respective plate edges, owing to the wave coherence effect at the plate boundary.
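The quarter-bending-wavelength condition above is easy to check numerically. The sketch below is a minimal illustration assuming classical thin-plate (Kirchhoff) theory; the steel plate properties in the example are illustrative assumptions, not values from the paper.

```python
import math

def bending_wavelength(freq_hz, thickness, rho, E, nu):
    """Bending wavelength of a thin plate (classical Kirchhoff theory)."""
    omega = 2 * math.pi * freq_hz
    D = E * thickness**3 / (12 * (1 - nu**2))        # flexural rigidity
    # mass per unit area = rho * h; dispersion: k_b^4 = omega^2 * rho * h / D
    k_b = (omega**2 * rho * thickness / D) ** 0.25
    return 2 * math.pi / k_b

def source_location_ok(distance_to_edge, freq_hz, thickness, rho, E, nu):
    """SEA validity check from the abstract: the source must sit more than a
    quarter bending wavelength away from the relevant plate edges."""
    return distance_to_edge > bending_wavelength(freq_hz, thickness, rho, E, nu) / 4
```

For a hypothetical 3 mm steel plate (rho = 7850 kg/m^3, E = 2.1e11 Pa, nu = 0.3) at 500 Hz, the bending wavelength is roughly 0.24 m, so the source would need to be more than about 6 cm from the relevant edges for the frequency-averaged SEA prediction to hold.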
Abstract:
Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further biological research. These biological sequences, which may be considered multiscale sequences, have specific features whose characterisation requires more refined methods. This project studies some of these biological challenges with multiscale analysis methods and a stochastic modelling approach. The first part of the thesis aims to cluster unknown proteins and to classify their families as well as their structural classes. A development in proteomic analysis concerns the determination of protein functions, and its first step is to classify proteins and predict their families. This motivates us to study unknown proteins from specific families, and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies, and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins. The second part of the thesis explores the relationships of proteins through a layered comparison of their components. Many methods are based on protein homology, because resemblance at the sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We therefore consider protein sequences at a detailed level to investigate the problem of protein comparison.
The comparison is based on the empirical mode decomposition (EMD), and protein sequences are characterised through their intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships of proteins. The third part of the thesis investigates the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus of genomic analysis, researchers have tried to understand the mechanisms of the yeast genome for many years, yet how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene, and we modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and an estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The constructed network is useful for uncovering further mechanisms of the yeast cell cycle.
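The stochastic differential equation modelling described above is commonly integrated with the Euler-Maruyama scheme. The sketch below uses a generic linear transcription model, not the thesis's actual formulation (the Gaussian membership function and the five-regulator estimation are omitted); the drift form, weights, and decay rate are illustrative assumptions.

```python
import math
import random

def simulate_expression(regulators, weights, gamma=0.5, sigma=0.05,
                        x0=0.0, dt=0.01, seed=42):
    """Euler-Maruyama integration of a toy transcription model:
        dx = (sum_j w_j * r_j(t) - gamma * x) dt + sigma dW
    `regulators` is a list of regulator time series (equal-length lists),
    `weights` the corresponding regulatory strengths, and `gamma` a decay rate.
    """
    rng = random.Random(seed)
    n_steps = len(regulators[0])
    x = x0
    path = [x]
    for t in range(n_steps):
        drift = sum(w * r[t] for w, r in zip(weights, regulators)) - gamma * x
        # dW is approximated by a Gaussian increment of variance dt
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path
```

With a single constant regulator (r = 1, w = 1) and gamma = 0.5, the simulated expression level relaxes towards the deterministic steady state w/gamma = 2 with small fluctuations.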
Abstract:
Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use a structured questionnaire in which experts are asked to score several items using an ordinal scale. The scores are then combined using a range of procedures, such as simple arithmetic means, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify harmful pests and correctly screen out those that are not. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we propose a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and then to estimate sensitivity and specificity values for different scoring systems from these data. The interest of our approach is illustrated in a case study in which several scoring systems were compared. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcomes of scoring systems and to assess the accuracy of the resulting decisions about pest introduction. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
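The simulation principle described above can be sketched as follows: generate virtual pests with known status, score them, combine the scores, and estimate sensitivity and specificity. The score distributions and decision threshold below are illustrative assumptions, not the probabilistic introduction model used in the article.

```python
import random

def simulate_scores(harmful, n_items=5, scale=5, rng=None):
    """Draw ordinal item scores on 1..scale; harmful pests tend to score high.
    The two Gaussian score distributions are illustrative assumptions."""
    rng = rng or random
    mean = scale * 0.8 if harmful else scale * 0.4
    return [min(scale, max(1, round(rng.gauss(mean, 1.0)))) for _ in range(n_items)]

def evaluate(combine, threshold, n=2000, seed=1):
    """Estimate sensitivity and specificity of one scoring system
    (a `combine` function plus a decision threshold) over simulated pests."""
    rng = random.Random(seed)
    tp = fn = tn = fp = 0
    for _ in range(n):
        harmful = rng.random() < 0.5
        risk = combine(simulate_scores(harmful, rng=rng))
        if harmful:
            tp, fn = (tp + 1, fn) if risk >= threshold else (tp, fn + 1)
        else:
            tn, fp = (tn + 1, fp) if risk < threshold else (tn, fp + 1)
    return tp / (tp + fn), tn / (tn + fp)
```

Under these assumptions, a sum-based system can be evaluated as `evaluate(sum, 15)` and a multiplication-based one as `evaluate(math.prod, 400)`, making the two families of scoring systems directly comparable on the same simulated data.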
Abstract:
In this contribution, a stability analysis is presented for a dynamic voltage restorer (DVR) connected to a weak ac system containing a dynamic load, using continuation techniques and bifurcation theory. The system dynamics are explored through the continuation of periodic solutions of the associated dynamic equations. The switching process in the DVR converter is taken into account to trace the stability regions through a suitable mathematical representation of the DVR converter. The stability regions in the Thevenin equivalent plane are computed, as are the stability regions in the control gains space and the contour lines for different Floquet multipliers. Moreover, the DVR converter model employed in this contribution avoids the need for the very complicated iterative-map approaches used in conventional bifurcation analysis of converters. The continuation method and the DVR model can accommodate dynamic and nonlinear loads and any network topology, since the analysis is carried out directly from the state-space equations. The bifurcation approach is shown to be both computationally efficient and robust, since it eliminates the need for numerically critical and long-lasting transient simulations.
Abstract:
Mathematics education literature has called for an abandonment of the ontological and epistemological ideologies that have often divided theory-based practice. Instead, a consilience of theories has been sought which would leverage the strengths of each learning theory and so positively impact upon contemporary educational practice. This research activity is based upon Popper’s notion of three knowledge worlds, which differentiates the knowledge shared in a community from the personal knowledge of the individual, and Bereiter’s characterisation of understanding as the individual’s relationship to tool-like knowledge. Using these notions, a re-conceptualisation of knowledge and understanding, and a subsequent re-consideration of learning theories, are proposed as a way to address the challenge set by the literature. Referred to as the alternative theoretical framework, the proposed theory accounts for the scaffolded transformation of each individual’s unique understanding, whilst acknowledging the existence of a body of domain knowledge shared amongst participants in a scientific community of practice. The alternative theoretical framework is embodied within an operational model accompanied by a visual nomenclature with which to describe consensually developed shared knowledge and personal understanding. This research activity has sought to iteratively evaluate the proposed theory through the practical application of the operational model and visual nomenclature to the domain of early-number counting, addition, and subtraction. This domain of mathematical knowledge has been comprehensively analysed and described. Through this process, the viability of the proposed theory as a tool with which to discuss, and thus improve, knowledge and understanding within the domain of mathematics has been validated.
Putting the proposed theory into practice has led to its refinement and to the achievement of a solid theoretical base for the future development of educational tools to support teaching and learning practice, including computer-mediated learning environments. Such future activity, using the proposed theory, will advance contemporary mathematics educational practice by bringing together the strengths of cognitivist, constructivist and post-constructivist learning theories.
Abstract:
A computational fluid dynamics (CFD) analysis has been performed for a flat plate photocatalytic reactor using CFD code FLUENT. Under the simulated conditions (Reynolds number, Re around 2650), a detailed time accurate computation shows the different stages of flow evolution and the effects of finite length of the reactor in creating flow instability, which is important to improve the performance of the reactor for storm and wastewater reuse. The efficiency of a photocatalytic reactor for pollutant decontamination depends on reactor hydrodynamics and configurations. This study aims to investigate the role of different parameters on the optimization of the reactor design for its improved performance. In this regard, more modelling and experimental efforts are ongoing to better understand the interplay of the parameters that influence the performance of the flat plate photocatalytic reactor.
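The quoted Reynolds number can be related to duct geometry and mean flow speed. A minimal sketch, assuming a rectangular flat-plate channel and water properties; the channel dimensions are hypothetical, as the abstract does not give the reactor geometry.

```python
def hydraulic_diameter(width, height):
    """Hydraulic diameter of a rectangular duct: D_h = 4*A / P."""
    return 4 * (width * height) / (2 * (width + height))

def reynolds_number(velocity, d_h, rho=998.0, mu=1.0e-3):
    """Re = rho * U * D_h / mu, defaulting to water at roughly 20 degC."""
    return rho * velocity * d_h / mu
```

With a hypothetical 0.2 m x 0.01 m channel (D_h about 0.019 m), a mean velocity of about 0.14 m/s yields Re near 2650, the regime simulated in the study.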
Abstract:
Urban water quality can be significantly impaired by the build-up of pollutants such as heavy metals and volatile organics on urban road surfaces due to vehicular traffic. Any control strategy for the mitigation of traffic-related build-up of heavy metals and volatile organic pollutants should be based on knowledge of their build-up processes. In the study discussed in this paper, the outcomes of a detailed experimental investigation into the build-up processes of heavy metals and volatile organics are presented. It was found that traffic parameters such as average daily traffic, volume-over-capacity ratio, and surface texture depth had similarly strong correlations with the build-up of heavy metals and volatile organics. Multicriteria decision analyses revealed that the 1-74 µm particulate fraction of total suspended solids (TSS) could be regarded as a surrogate indicator for particulate heavy metals in build-up, and that the same fraction of total organic carbon could be regarded as a surrogate indicator for particulate volatile organics build-up. In terms of pollutant affinity, TSS was found to be the predominant parameter for particulate heavy metals build-up, and total dissolved solids was found to be the predominant parameter for the potentially dissolved fraction of heavy metals build-up. It was also found that land use did not play a significant role in the build-up of traffic-generated heavy metals and volatile organics.
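The strength of the correlations reported above is typically quantified with the Pearson coefficient. A minimal, self-contained sketch; the traffic and build-up figures in the example are hypothetical, not measurements from the study.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: average daily traffic vs. a metal build-up load
adt = [8500, 12000, 15500, 21000, 26500]       # vehicles/day (invented)
zinc = [0.21, 0.30, 0.36, 0.52, 0.61]          # g/m^2 (invented)
r = pearson_r(adt, zinc)
```

A value of r close to 1 would correspond to the "similarly strong correlations" between traffic parameters and pollutant build-up reported in the abstract.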
Abstract:
This paper demonstrates the capabilities of wavelet transform (WT) for analyzing important features related to bottleneck activations and traffic oscillations in congested traffic in a systematic manner. In particular, the analysis of loop detector data from a freeway shows that the use of wavelet-based energy can effectively identify the location of an active bottleneck, the arrival time of the resulting queue at each upstream sensor location, and the start and end of a transition during the onset of a queue. Vehicle trajectories were also analyzed using WT and our analysis shows that the wavelet-based energies of individual vehicles can effectively detect the origins of deceleration waves and shed light on possible triggers (e.g., lane-changing). The spatiotemporal propagations of oscillations identified by tracing wavelet-based energy peaks from vehicle to vehicle enable analysis of oscillation amplitude, duration and intensity.
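The idea of locating a queue arrival from the wavelet-based energy of a speed series can be sketched with a Haar-style difference-of-averages transform. This is a simplified stand-in for the continuous wavelet analysis applied to loop detector data in the paper; the scales, energy definition, and toy speed series are illustrative assumptions.

```python
def haar_energy(series, scales=(2, 4, 8)):
    """Pointwise wavelet-based energy from Haar differences-of-averages.
    A sharp drop in the series produces an energy peak at its location."""
    n = len(series)
    energy = [0.0] * n
    for s in scales:
        for i in range(s, n - s):
            left = sum(series[i - s:i]) / s      # average before point i
            right = sum(series[i:i + s]) / s     # average after point i
            coeff = right - left                 # Haar-style detail coefficient
            energy[i] += coeff * coeff
    return energy

def queue_arrival_index(series):
    """Index of the maximum energy: candidate queue-arrival time at a sensor."""
    e = haar_energy(series)
    return max(range(len(e)), key=e.__getitem__)
```

Applied to a detector speed series, the dominant energy peak marks the abrupt speed drop at the arrival of the queue, which is the mechanism the paper exploits to identify active bottlenecks and transition periods.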
Abstract:
In this paper we identify the origins of stop-and-go (or slow-and-go) driving and measure the microscopic features of their propagation by analyzing vehicle trajectories via the wavelet transform. Based on 53 oscillation cases analyzed, we find that oscillations can originate from either lane-changing maneuvers (LCMs) or car-following (CF) behavior. LCMs were predominantly responsible for oscillation formation in the absence of considerable horizontal or vertical curves, whereas oscillations formed spontaneously near roadside work on an uphill segment. Regardless of the trigger, the features of oscillation propagation were similar in terms of propagation speed, oscillation duration, and amplitude. All observed cases initially exhibited a precursor phase, in which slow-and-go motions were localized. Some of them eventually transitioned into a well-developed phase, in which oscillations propagated upstream in the queue. LCMs were primarily responsible for the transition, although some transitions occurred without LCMs. Our findings also suggest that an oscillation has a regressive effect on car-following behavior: the deceleration wave of an oscillation induces a timid driver (with larger response time and minimum spacing) to become less timid, and an aggressive driver to become less aggressive, although this change may be short-lived. An extended framework of Newell's CF model is able to describe the regressive effects with two additional parameters with reasonable accuracy, as verified using vehicle trajectory data.
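The extended framework mentioned above builds on Newell's simplified car-following rule, in which a follower reproduces the leader's trajectory shifted by a response time and a minimum spacing. Below is a minimal sketch of the base rule only; the two additional regressive-effect parameters are not reproduced here, and the discrete time step and initial-equilibrium assumption are mine.

```python
def newell_follower(leader_pos, tau_steps, d):
    """Newell's simplified car-following rule: the follower replicates the
    leader's trajectory shifted by a response time (tau, in time steps here)
    and a minimum spacing d, i.e. x_f(t) = x_l(t - tau) - d."""
    follower = []
    for t in range(len(leader_pos)):
        if t < tau_steps:
            # assume the pair starts in equilibrium behind the leader's origin
            follower.append(leader_pos[0] - d)
        else:
            follower.append(leader_pos[t - tau_steps] - d)
    return follower
```

In the paper's terms, a "timid" driver corresponds to a larger tau and d and an "aggressive" driver to smaller values; the regressive effect is a temporary shift of these parameters after a deceleration wave passes.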
Abstract:
Analyzing security protocols has been an ongoing research topic in recent years. Different types of tools have been developed to make the analysis process more precise, faster, and easier. However, these tools treat security protocols as black boxes that cannot easily be composed, making low-level analysis, or the combination of different tools, difficult or impossible. This research uses Coloured Petri Nets (CPN) to analyze the OSAP trusted computing protocol. The OSAP protocol is modeled at different levels and analyzed using the state space method. The resulting model can be combined with models of other trusted computing protocols in future work.
Abstract:
Traffic oscillations are typical features of congested traffic flow, characterized by recurring decelerations followed by accelerations, yet knowledge of this complex topic remains limited. In this research, 1) the impact of traffic oscillations on freeway crash occurrence was measured using a matched case-control design; the results consistently reveal that oscillations have a more significant impact on freeway safety than average traffic states. 2) The wavelet transform was adopted to locate oscillation origins and measure their characteristics along their propagation paths using vehicle trajectory data. 3) The impact of a lane-changing maneuver on the immediate follower was measured and modeled. The knowledge and the new models generated from this study could provide a better understanding of the fundamentals of congested traffic, enable improvements to existing traffic control strategies and freeway crash countermeasures, and motivate the development of new operational strategies aimed at reducing the negative effects of oscillatory driving.
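The matched case-control measurement in 1) rests on the conditional (McNemar) odds ratio, which uses only discordant case/control pairs. A minimal sketch, where "exposure" would stand for, say, the presence of oscillations before a crash; the pair counts in the test are invented for illustration.

```python
def matched_pairs_odds_ratio(pairs):
    """Conditional (McNemar) odds ratio for 1:1 matched case-control data.
    `pairs` is a list of (case_exposed, control_exposed) booleans; only
    discordant pairs carry information, and OR = b / c, where b counts
    case-exposed/control-unexposed pairs and c the reverse."""
    b = sum(1 for case, ctrl in pairs if case and not ctrl)
    c = sum(1 for case, ctrl in pairs if ctrl and not case)
    if c == 0:
        raise ValueError("no control-only discordant pairs; OR undefined")
    return b / c
```

An odds ratio above 1 would indicate that the exposure (here, oscillatory conditions) is more common before crashes than in the matched non-crash controls.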
Abstract:
Variable Speed Limits (VSL) are an Intelligent Transportation Systems (ITS) control tool that can enhance traffic safety and has the potential to contribute to traffic efficiency. This study presents the results of a calibration and operational analysis of a candidate VSL algorithm for high-flow conditions on an urban motorway in Queensland, Australia. The analysis was carried out using a framework consisting of a microscopic simulation model combined with a runtime API and a proposed efficiency index. The operational analysis covers impacts on the speed-flow curve, travel time, speed deviation, fuel consumption, and emissions.
Abstract:
The seawater neutralisation process is currently used in the alumina industry to reduce the pH and dissolved metal concentrations in bauxite refinery residues through the precipitation of Mg, Al, and Ca hydroxide and carbonate minerals. This neutralisation method is very similar to the co-precipitation method used to synthesise hydrotalcite (Mg6Al2(OH)16CO3·4H2O). This study examines the effect of temperature on the type of precipitates that form during the seawater neutralisation of Bayer liquor. The Bayer precipitates were characterised by a variety of techniques, including X-ray diffraction (XRD), Raman spectroscopy, and infrared spectroscopy. The mineralogical composition of the Bayer precipitates largely comprises hydrotalcite, hydromagnesite, and calcium carbonate species. XRD showed that Bayer hydrotalcites synthesised at 55 °C have a larger interlayer distance, indicating that more anions are removed from the Bayer liquor. Vibrational spectroscopic techniques identified an increase in hydrogen bond strength for precipitates formed at 55 °C, suggesting the formation of a more stable Bayer hydrotalcite. Raman spectroscopy identified the intercalation of sulfate and carbonate anions into Bayer hydrotalcites under these synthesis conditions.
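The interlayer distances inferred from XRD follow from Bragg's law. A minimal sketch assuming Cu K-alpha radiation; the 2-theta value in the example is a typical hydrotalcite (003) reflection position, not a measurement from this study.

```python
import math

def bragg_d_spacing(two_theta_deg, wavelength=1.5406, order=1):
    """Interplanar spacing from Bragg's law, d = n * lambda / (2 sin(theta)).
    The default wavelength is Cu K-alpha, 1.5406 angstrom."""
    theta = math.radians(two_theta_deg / 2)
    return order * wavelength / (2 * math.sin(theta))
```

For example, a (003) reflection near 2-theta = 11.3 degrees corresponds to d of about 7.8 angstrom, in the range typical of carbonate-intercalated hydrotalcites; a shift of the reflection to lower angles indicates the larger interlayer distance reported for the 55 °C precipitates.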