Abstract:
Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further research in biological fields. These biological sequences, which may be considered multiscale sequences, have specific features that require more refined methods to characterise. This project aims to study some of these biological challenges with multiscale analysis methods and a stochastic modelling approach. The first part of the thesis aims to cluster unknown proteins and classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions, and the first step in this development is to classify proteins and predict their families. This motivates us to study unknown proteins from specific families and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins. The second part of the thesis aims to explore the relationships of proteins through a layered comparison of their components. Many methods are based on protein homology, because resemblance at the protein sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We therefore consider protein sequences at a detailed level to investigate the problem of protein comparison.
The comparison is based on the empirical mode decomposition (EMD), with protein sequences analysed through their intrinsic mode functions. A measure of similarity is introduced via a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships between proteins. The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus of genomic analysis, researchers have tried for many years to understand the mechanisms of the yeast genome, and how cells control gene expression still requires further investigation. We use a stochastic differential equation to model the expression profile of a target gene, and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The construction of the transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
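A stochastic differential equation model of a target gene's expression, as described above, can be integrated numerically with an Euler–Maruyama scheme. The sketch below is a minimal illustration only: it assumes a hypothetical form dX = (rate(t) − decay·X) dt + σ dW in which the transcriptional rate sums Gaussian-membership-weighted regulator levels; the function names, parameters and model form are illustrative and not the thesis's fitted model.

```python
import numpy as np

def gaussian_membership(x, c, s):
    # Gaussian membership function weighting each regulator's influence
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def simulate_target_gene(regulators, a, c, s, decay=0.8, sigma=0.05,
                         dt=0.01, x0=0.0, seed=0):
    """Euler-Maruyama integration of
       dX = (rate(t) - decay * X) dt + sigma dW,
    where rate(t) sums Gaussian-membership-weighted regulator levels.
    `regulators` is an (n_steps, n_regulators) array of expression levels."""
    rng = np.random.default_rng(seed)
    n_steps, _ = regulators.shape
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        rate = np.sum(a * gaussian_membership(regulators[t], c, s))
        x[t + 1] = (x[t] + (rate - decay * x[t]) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())
    return x
```

With sigma set to zero the scheme reduces to plain Euler integration of the deterministic decay model, which is a quick sanity check on the drift term.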
Abstract:
Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use a structured questionnaire in which experts are asked to score several items using an ordinal scale. The scores are then combined using a range of procedures, such as the simple arithmetic mean, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify harmful pests while screening out those that are not. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we propose a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and then to estimate sensitivity and specificity values for different scoring systems from these data. The interest of our approach is illustrated in a case study comparing several scoring systems. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcome of scoring systems and to assess the accuracy of the resulting decisions about positive and negative introduction. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
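The simulate-then-score principle can be sketched in a few lines. The toy generative model below (harmful pests tend to draw higher ordinal item scores) is an assumption for illustration, not the paper's probabilistic pest introduction model; it shows how sensitivity and specificity are estimated for sum-based and multiplication-based combinations of the same simulated item scores.

```python
import numpy as np

def simulate_scores(n, k_items, levels, seed=0):
    """Toy generative model (an assumption, not the paper's model):
    harmful pests draw higher ordinal scores (1..levels) on each item."""
    rng = np.random.default_rng(seed)
    harmful = np.arange(n) < n // 2           # first half labelled harmful
    p = np.where(harmful, 0.7, 0.3)           # per-pest scoring tendency
    scores = 1 + rng.binomial(levels - 1, p[:, None], size=(n, k_items))
    return scores, harmful

def sens_spec(risk, harmful, threshold):
    """Sensitivity/specificity of the decision 'risk >= threshold'."""
    pred = risk >= threshold
    sensitivity = pred[harmful].mean()
    specificity = (~pred[~harmful]).mean()
    return sensitivity, specificity

scores, harmful = simulate_scores(n=2000, k_items=4, levels=5)
for name, risk in [("sum", scores.sum(axis=1)),
                   ("product", scores.prod(axis=1))]:
    s, p = sens_spec(risk, harmful, threshold=np.median(risk))
    print(f"{name}: sensitivity={s:.2f} specificity={p:.2f}")
```

Swapping in different numbers of scale points or combination rules, and comparing the resulting sensitivity/specificity pairs, mirrors the comparison carried out in the article.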
Abstract:
In this contribution, a stability analysis is presented for a dynamic voltage restorer (DVR) connected to a weak ac system containing a dynamic load, using continuation techniques and bifurcation theory. The system dynamics are explored through the continuation of periodic solutions of the associated dynamic equations. The switching process in the DVR converter is taken into account to trace the stability regions through a suitable mathematical representation of the converter. The stability regions in the Thevenin equivalent plane are computed, along with the stability regions in the control-gains space and the contour lines for different Floquet multipliers. Moreover, the DVR converter model employed in this contribution avoids the need to develop the very complicated iterative-map approaches used in conventional bifurcation analysis of converters. The continuation method and the DVR model can accommodate dynamic and nonlinear loads and any network topology, since the analysis is carried out directly from the state-space equations. The bifurcation approach is shown to be both computationally efficient and robust, since it eliminates the need for numerically critical and long-lasting transient simulations.
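Computing Floquet multipliers, the quantities whose contour lines are traced above, amounts to integrating the fundamental matrix of the linearised periodic system over one period and taking the eigenvalues of the resulting monodromy matrix. The sketch below illustrates this on the Mathieu equation as a simple stand-in for the DVR's periodic dynamics; the equation, parameters and step counts are illustrative and not the paper's model.

```python
import numpy as np

def floquet_multipliers(a, b, period=2 * np.pi, n_steps=2000):
    """Floquet multipliers of the Mathieu equation x'' + (a + b cos t) x = 0:
    integrate the 2x2 fundamental matrix over one period with classical RK4;
    the eigenvalues of the monodromy matrix are the multipliers."""
    def f(t, Y):
        A = np.array([[0.0, 1.0],
                      [-(a + b * np.cos(t)), 0.0]])
        return A @ Y

    Y = np.eye(2)           # fundamental matrix, Y(0) = I
    h = period / n_steps
    t = 0.0
    for _ in range(n_steps):
        k1 = f(t, Y)
        k2 = f(t + h / 2, Y + h / 2 * k1)
        k3 = f(t + h / 2, Y + h / 2 * k2)
        k4 = f(t + h, Y + h * k3)
        Y = Y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return np.linalg.eigvals(Y)
```

A solution is stable when all multipliers lie on or inside the unit circle; because the system above is Hamiltonian (zero trace), the product of its multipliers is always 1, which makes a convenient numerical check.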
Abstract:
Mathematics education literature has called for an abandonment of the ontological and epistemological ideologies that have often divided theory-based practice. Instead, a consilience of theories has been sought that would leverage the strengths of each learning theory and so positively impact contemporary educational practice. This research activity is based upon Popper's notion of three knowledge worlds, which differentiates the knowledge shared in a community from the personal knowledge of the individual, and Bereiter's characterisation of understanding as the individual's relationship to tool-like knowledge. Using these notions, a re-conceptualisation of knowledge and understanding and a subsequent re-consideration of learning theories are proposed as a way to address the challenge set by the literature. Referred to as the alternative theoretical framework, the proposed theory accounts for the scaffolded transformation of each individual's unique understanding, whilst acknowledging the existence of a body of domain knowledge shared amongst participants in a scientific community of practice. The alternative theoretical framework is embodied within an operational model that is accompanied by a visual nomenclature with which to describe consensually developed shared knowledge and personal understanding. This research activity has sought to iteratively evaluate the proposed theory through the practical application of the operational model and visual nomenclature to the domain of early-number counting, addition and subtraction. This domain of mathematical knowledge has been comprehensively analysed and described. Through this process, the viability of the proposed theory as a tool with which to discuss, and thus improve, knowledge and understanding within the domain of mathematics has been validated.
Putting the proposed theory into practice has led to its refinement and the subsequent achievement of a solid theoretical base for the future development of educational tools to support teaching and learning practice, including computer-mediated learning environments. Such future activity, using the proposed theory, will advance contemporary mathematics educational practice by bringing together the strengths of cognitivist, constructivist and post-constructivist learning theories.
Abstract:
A computational fluid dynamics (CFD) analysis has been performed for a flat plate photocatalytic reactor using the CFD code FLUENT. Under the simulated conditions (Reynolds number, Re, around 2650), a detailed time-accurate computation shows the different stages of flow evolution and the effects of the finite length of the reactor in creating flow instability, which is important for improving the performance of the reactor for storm- and wastewater reuse. The efficiency of a photocatalytic reactor for pollutant decontamination depends on reactor hydrodynamics and configuration. This study aims to investigate the role of different parameters in optimising the reactor design for improved performance. In this regard, further modelling and experimental efforts are ongoing to better understand the interplay of the parameters that influence the performance of the flat plate photocatalytic reactor.
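For reference, a Reynolds number of the order quoted above follows from the standard definition Re = ρvL/μ. A minimal sketch, using illustrative water properties and a hypothetical channel dimension rather than the study's actual geometry:

```python
def reynolds_number(rho, velocity, length, mu):
    """Re = rho * v * L / mu, with L the characteristic length
    (e.g., the hydraulic diameter of the reactor channel)."""
    return rho * velocity * length / mu

# Illustrative values only: water at ~20 C in a hypothetical 5 cm channel
re = reynolds_number(rho=998.0, velocity=0.053, length=0.05, mu=1.0e-3)
print(f"Re = {re:.0f}")
```

Values in this range sit in the laminar-to-transitional regime, which is consistent with the flow instability the computation is examining.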
Abstract:
Urban water quality can be significantly impaired by the build-up of pollutants such as heavy metals and volatile organics on urban road surfaces due to vehicular traffic. Any control strategy for mitigating traffic-related build-up of heavy metals and volatile organic pollutants should be based on knowledge of their build-up processes. In the study discussed in this paper, the outcomes of a detailed experimental investigation into the build-up processes of heavy metals and volatile organics are presented. It was found that traffic parameters such as average daily traffic, volume-over-capacity ratio and surface texture depth had similarly strong correlations with the build-up of heavy metals and volatile organics. Multicriteria decision analyses revealed that the 1–74 µm particulate fraction of total suspended solids (TSS) could be regarded as a surrogate indicator for particulate heavy metals in build-up, and that the same fraction of total organic carbon could be regarded as a surrogate indicator for particulate volatile organics build-up. In terms of pollutant affinity, TSS was found to be the predominant parameter for particulate heavy metals build-up, and total dissolved solids was found to be the predominant parameter for the potentially dissolved fraction of heavy metals build-up. It was also found that land use did not play a significant role in the build-up of traffic-generated heavy metals and volatile organics.
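The strength of the correlations reported above between traffic parameters and build-up loads is typically quantified with the Pearson coefficient. A minimal sketch with entirely hypothetical site data (the study's measured values are not reproduced here):

```python
import numpy as np

# Hypothetical figures: average daily traffic (ADT) at five road sites
# versus the heavy-metal build-up load measured there (mg/m^2)
adt = np.array([8500.0, 12000.0, 15500.0, 21000.0, 26500.0])
metal_load = np.array([0.41, 0.55, 0.72, 0.93, 1.18])

# Pearson correlation between the traffic parameter and the build-up load
r = np.corrcoef(adt, metal_load)[0, 1]
print(f"Pearson r = {r:.3f}")
```

Repeating the same calculation for each traffic parameter against each pollutant fraction gives the correlation matrix on which surrogate-indicator judgements rest.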
Abstract:
This paper demonstrates the capabilities of the wavelet transform (WT) for systematically analyzing important features related to bottleneck activations and traffic oscillations in congested traffic. In particular, the analysis of loop detector data from a freeway shows that wavelet-based energy can effectively identify the location of an active bottleneck, the arrival time of the resulting queue at each upstream sensor location, and the start and end of a transition during the onset of a queue. Vehicle trajectories were also analyzed using the WT, and the analysis shows that the wavelet-based energies of individual vehicles can effectively detect the origins of deceleration waves and shed light on possible triggers (e.g., lane-changing). The spatiotemporal propagation of oscillations, identified by tracing wavelet-based energy peaks from vehicle to vehicle, enables analysis of oscillation amplitude, duration and intensity.
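The wavelet-based energy measure can be sketched as the scale-averaged squared coefficients of a continuous wavelet transform: a sudden speed drop recorded by a detector then shows up as an energy peak at the time the queue arrives. The sketch below, using a hand-rolled Mexican-hat wavelet and a synthetic speed series, illustrates the idea only and is not the paper's calibrated procedure.

```python
import numpy as np

def mexican_hat(t, scale):
    # Mexican-hat (Ricker) mother wavelet, dilated by `scale`
    x = t / scale
    return (1 - x ** 2) * np.exp(-x ** 2 / 2)

def wavelet_energy(signal, scales, dt=1.0):
    """Average of squared CWT coefficients across scales at each sample.
    Sharp changes (e.g., queue arrival at a sensor) appear as energy peaks."""
    energy = np.zeros(len(signal))
    for s in scales:
        t = np.arange(-4 * s, 4 * s + 1) * dt
        w = mexican_hat(t, s) / np.sqrt(s)      # L2-style normalisation
        coeff = np.convolve(signal - signal.mean(), w, mode="same")
        energy += coeff ** 2
    return energy / len(scales)

# Synthetic detector speeds: free flow, then a sudden queue-induced drop
speed = np.r_[np.full(100, 100.0), np.full(100, 30.0)]
e = wavelet_energy(speed, scales=[4, 8, 16])
print("energy peak near sample", int(np.argmax(e)))
```

Because the Mexican hat has zero mean, constant free-flow stretches contribute almost no energy; the peak concentrates around the speed discontinuity, which is the property the bottleneck-identification analysis exploits.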
Abstract:
In this paper we identify the origins of stop-and-go (or slow-and-go) driving and measure microscopic features of its propagation by analyzing vehicle trajectories via the wavelet transform. Based on 53 oscillation cases analyzed, we find that oscillations can originate from either lane-changing maneuvers (LCMs) or car-following (CF) behavior. LCMs were predominantly responsible for oscillation formation in the absence of considerable horizontal or vertical curves, whereas oscillations formed spontaneously near roadside work on an uphill segment. Regardless of the trigger, the features of oscillation propagation were similar in terms of propagation speed, oscillation duration, and amplitude. All observed cases initially exhibited a precursor phase, in which slow-and-go motions were localized. Some of them eventually transitioned into a well-developed phase, in which oscillations propagated upstream in the queue. LCMs were primarily responsible for this transition, although some transitions occurred without LCMs. Our findings also suggest that an oscillation has a regressive effect on car-following behavior: a deceleration wave of an oscillation induces a timid driver (one with larger response time and minimum spacing) to become less timid and an aggressive driver to become less aggressive, although this change may be short-lived. An extended framework of Newell's CF model is able to describe these regressive effects with two additional parameters with reasonable accuracy, as verified using vehicle trajectory data.
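Newell's car-following model, on which the extended framework above builds, can be sketched in a few lines: the follower replicates the leader's trajectory shifted by a response time and a minimum spacing. The sketch below shows the base model only; the two extra regressive-effect parameters of the extended framework are not reproduced, and all trajectory numbers are illustrative.

```python
import numpy as np

def newell_follower(leader_x, tau_steps, d):
    """Newell's simplified car-following model:
       x_f(t) = x_l(t - tau) - d,
    i.e. the follower repeats the leader's trajectory delayed by the
    response time (tau_steps samples) and offset by the spacing d.
    The warm-up interval simply holds the initial gap."""
    follower = np.empty_like(leader_x)
    follower[:tau_steps] = leader_x[0] - d
    follower[tau_steps:] = leader_x[:-tau_steps] - d
    return follower

# Leader cruises, decelerates briefly (one oscillation), then recovers
dt = 0.1
v = np.r_[np.full(50, 20.0), np.linspace(20, 5, 30), np.full(20, 5.0),
          np.linspace(5, 20, 30), np.full(50, 20.0)]
leader_x = np.cumsum(v) * dt
foll_x = newell_follower(leader_x, tau_steps=10, d=8.0)
spacing = leader_x - foll_x   # never falls below d in this model
```

Changing tau_steps and d per driver is what lets the model express "timid" versus "aggressive" behaviour; the regressive effect reported above corresponds to those parameters shifting after a deceleration wave passes.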
Abstract:
The analysis of security protocols has been an active research area in recent years. Different types of tools have been developed to make the analysis process more precise, faster and easier. However, these tools treat security protocols as black boxes that cannot easily be composed, making it difficult or impossible to perform low-level analysis or to combine different tools with each other. This research uses Coloured Petri Nets (CPN) to analyze the OSAP trusted computing protocol. The OSAP protocol is modeled at different levels and analyzed using the state space method. The resulting model can be combined with other trusted computing protocols in future work.
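At its core, state-space analysis enumerates all reachable states (markings, in CPN terms) and checks properties over them. The sketch below is a generic breadth-first reachability exploration over a toy session model; the states and transitions are purely hypothetical and do not represent the actual OSAP protocol or a CPN model.

```python
from collections import deque

def reachable_states(initial, transitions):
    """Breadth-first exploration of a protocol's state space.
    `transitions` maps a state to an iterable of successor states."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Toy session model (hypothetical): states are (phase, session_open) pairs
def toy_transitions(state):
    phase, session_open = state
    if phase == "idle":
        return [("auth", False)]
    if phase == "auth":
        return [("session", True),    # authorisation succeeds
                ("idle", False)]      # authorisation fails
    return [("idle", False)]          # session closed

states = reachable_states(("idle", False), toy_transitions)
```

Properties such as "a session can never be open in the idle phase" are then simple checks over the returned set, which is the same style of query a CPN state-space tool answers over its occurrence graph.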
Abstract:
Traffic oscillations are typical features of congested traffic flow, characterized by recurring decelerations followed by accelerations. However, knowledge of this complex topic remains limited. In this research, 1) the impact of traffic oscillations on freeway crash occurrence is measured using a matched case-control design; the results consistently reveal that oscillations have a more significant impact on freeway safety than average traffic states. 2) The wavelet transform is adopted to locate oscillations' origins and measure their characteristics along their propagation paths using vehicle trajectory data. 3) The impact of lane-changing maneuvers on the immediate follower is measured and modeled. The knowledge and new models generated by this study could provide a better understanding of the fundamentals of congested traffic, enable improvements to existing traffic control strategies and freeway crash countermeasures, and motivate the development of new operational strategies aimed at reducing the negative effects of oscillatory driving.
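In a 1:1 matched case-control design, the exposure effect is estimated from discordant pairs only: the odds ratio is the count of pairs in which the case was exposed (here, to oscillatory conditions) but its matched control was not, divided by the count of pairs the other way around. A minimal sketch with made-up counts, not the study's data:

```python
def matched_pairs_odds_ratio(case_exposed_only, control_exposed_only):
    """Conditional odds ratio for 1:1 matched case-control data:
    ratio of the two discordant pair counts. Concordant pairs
    (both exposed or neither exposed) carry no information here."""
    return case_exposed_only / control_exposed_only

# Hypothetical counts: 30 crash cases exposed to an oscillation while the
# matched control was not, versus 10 pairs the other way around
odds_ratio = matched_pairs_odds_ratio(30, 10)
```

An odds ratio above 1 in this setup would point to oscillation exposure being associated with crash occurrence, which is the kind of association the matched design above is built to detect.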
Abstract:
Variable Speed Limits (VSL) is a control tool of Intelligent Transportation Systems (ITS) that can enhance traffic safety and has the potential to contribute to traffic efficiency. This study presents the results of a calibration and operational analysis of a candidate VSL algorithm for high-flow conditions on an urban motorway in Queensland, Australia. The analysis was carried out using a framework consisting of a microscopic simulation model combined with a runtime API and a proposed efficiency index. The operational analysis covers impacts on the speed-flow curve, travel time, speed deviation, fuel consumption and emissions.
Abstract:
The seawater neutralisation process is currently used in the alumina industry to reduce the pH and dissolved metal concentrations in bauxite refinery residues through the precipitation of Mg, Al, and Ca hydroxide and carbonate minerals. This neutralisation method is very similar to the co-precipitation method used to synthesise hydrotalcite (Mg6Al2(OH)16CO3·4H2O). This study examines the effect of temperature on the type of precipitates that form during the seawater neutralisation of Bayer liquor. The Bayer precipitates have been characterised by a variety of techniques, including X-ray diffraction (XRD), Raman spectroscopy and infrared spectroscopy. The mineralogical composition of the Bayer precipitates largely comprises hydrotalcite, hydromagnesite, and calcium carbonate species. XRD determined that Bayer hydrotalcites synthesised at 55 °C have a larger interlayer distance, indicating that more anions are removed from the Bayer liquor. Vibrational spectroscopic techniques identified an increase in hydrogen-bond strength for precipitates formed at 55 °C, suggesting the formation of a more stable Bayer hydrotalcite. Raman spectroscopy identified the intercalation of sulfate and carbonate anions into Bayer hydrotalcites under these synthesis conditions.
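The interlayer distances determined by XRD follow from Bragg's law, nλ = 2d sinθ. A minimal sketch, assuming Cu Kα radiation (λ ≈ 1.5406 Å) and an illustrative 2θ peak position rather than the study's measured patterns:

```python
import math

def bragg_d_spacing(two_theta_deg, wavelength=1.5406, order=1):
    """d-spacing (in the wavelength's units) from a diffraction peak,
    via Bragg's law: n * lambda = 2 * d * sin(theta)."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength / (2.0 * math.sin(theta))

# e.g. a basal reflection near 2-theta = 11.2 deg corresponds to
# a d-spacing of roughly 7.9 angstroms (carbonate hydrotalcite range)
d_basal = bragg_d_spacing(11.2)
```

A larger interlayer distance thus shows up as the basal reflection shifting to lower 2θ, which is how the expansion at 55 °C is read off the diffraction pattern.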
Abstract:
The mechanism of the decomposition of hydrotalcite remains unresolved; controlled rate thermal analysis (CRTA) enables this decomposition pathway to be explored. The thermal decomposition of hydrotalcites with hexacyanoferrate(II) and hexacyanoferrate(III) in the interlayer has been studied using CRTA. X-ray diffraction shows that the hydrotalcites studied have d(003) spacings of 11.1 and 10.9 Å, which compare with d-spacings of 7.9 and 7.98 Å for hydrotalcites with carbonate or sulphate in the interlayer. Calculations based upon the CRTA measurements show that 7 moles of water are lost, indicating that the formula of the hexacyanoferrate(II) intercalated hydrotalcite is Mg6Al2(OH)16[Fe(CN)6]0.5·7H2O and that of the hexacyanoferrate(III) intercalated hydrotalcite is Mg6Al2(OH)16[Fe(CN)6]0.66·9H2O. Dehydroxylation combined with loss of CN units occurs in three steps, between (a) 310 and 367 °C, (b) 367 and 390 °C, and (c) 390 and 428 °C, for both the hexacyanoferrate(II) and hexacyanoferrate(III) intercalated hydrotalcites.
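The number of waters of crystallisation follows from the mass lost in the dehydration step. A minimal sketch of the back-calculation, with illustrative numbers rather than the paper's measured masses:

```python
WATER_MOLAR_MASS = 18.015  # g/mol

def water_moles_from_mass_loss(mass_loss_fraction, anhydrous_molar_mass):
    """Moles of interlayer water per formula unit from a dehydration step.
    Solves loss = n * M_w / (M_anhydrous + n * M_w) for n."""
    return (mass_loss_fraction * anhydrous_molar_mass
            / (WATER_MOLAR_MASS * (1.0 - mass_loss_fraction)))

# Illustrative only: a 17.4% dehydration mass loss for a hypothetical
# 600 g/mol anhydrous framework corresponds to about 7 waters
n_water = water_moles_from_mass_loss(0.174, 600.0)
```

Rounding the result to the nearest sensible stoichiometry is what fixes the ·nH2O term in a proposed formula such as those quoted above.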
Abstract:
Spatially offset Raman spectroscopy (SORS) is a powerful new technique for the non-invasive detection and identification of concealed substances and drugs. Here, we demonstrate the SORS technique in several scenarios relevant to customs screening, postal screening, drug detection and forensics applications. The examples include analysis of a multi-layered postal package to identify a concealed substance; identification of an antibiotic capsule inside its plastic blister pack; analysis of an envelope containing a powder; and identification of a drug dissolved in a clear solvent, contained in a non-transparent plastic bottle. As well as providing practical examples of SORS, the results highlight several considerations regarding the use of SORS in the field, including the advantages of different analysis geometries and the ability to tailor instrument parameters and optics to suit different types of packages and samples. We also discuss the features and benefits of SORS in relation to existing Raman techniques, including confocal microscopy, wide-area illumination and conventional backscatter Raman spectroscopy. The results will contribute to the recognition of SORS as a promising method for the rapid, chemically specific analysis and detection of drugs and pharmaceuticals.