964 results for Technical Analysis
Abstract:
An approach to incorporating spatial dependence into stochastic frontier analysis is developed and applied to a sample of 215 dairy farms in England and Wales. Several alternative specifications of the spatial weight matrix are used to analyse their effect on the estimation of spatial dependence. Estimation is conducted using a Bayesian approach, and the results indicate that spatial dependence is present when explaining technical inefficiency.
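A common starting point for a spatial weight matrix of the kind varied in the study above is a row-standardised k-nearest-neighbour specification. The sketch below is an illustration only, not the paper's actual specification; the function and parameter names are hypothetical:

```python
import numpy as np

def knn_weight_matrix(coords, k=3):
    """Row-standardised k-nearest-neighbour spatial weight matrix.

    coords: (n, 2) array of unit locations (e.g. farms); W[i, j] > 0 iff
    unit j is among the k nearest neighbours of unit i, each row summing to 1.
    """
    coords = np.asarray(coords, dtype=float)
    n = coords.shape[0]
    # pairwise Euclidean distances between all units
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # a unit is never its own neighbour
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[:k]      # indices of the k nearest units
        W[i, nbrs] = 1.0 / k             # row-standardise: each row sums to 1
    return W
```

Alternative specifications (distance-band, inverse-distance) replace only the neighbour-selection and weighting steps inside the loop.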
Abstract:
Air distribution systems are among the major consumers of electrical energy in air-conditioned commercial buildings, maintaining a comfortable indoor thermal environment and air quality by supplying specified amounts of treated air to different zones. The sizes of the air distribution lines affect the energy efficiency of these systems. Equal friction and static regain are two well-known approaches for sizing air distribution lines; to address the life cycle cost of air distribution systems, the T and IPS methods have been developed. Hitherto, all of these methods have been based on static design conditions, so the dynamic performance of the system has not yet been addressed, even though air distribution systems mostly operate under dynamic rather than static conditions. Moreover, none of the existing methods considers thermal comfort or environmental impacts. This study investigates the existing methods for sizing air distribution systems and proposes a dynamic approach for size optimisation of air distribution lines that takes into account optimisation criteria covering economic aspects, environmental impacts and technical performance. These criteria are addressed through whole life costing analysis, life cycle assessment and deviation from the set-point temperature of different zones, respectively. Integrating these criteria into the TRNSYS software produces a novel dynamic optimisation approach for duct sizing. Because the criteria are integrated into a well-known performance evaluation tool, the approach can be easily adopted by designers within the busy nature of design practice. Comparison of this integrated approach with the existing methods reveals that, under the defined criteria, system performance improves by up to 15%. The approach represents a significant step towards net zero emission buildings.
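For context, the equal friction method mentioned above sizes each round duct so that the frictional pressure drop per metre of run matches a chosen design rate. A minimal closed-form sketch, assuming a constant Darcy friction factor (an assumption made here for illustration; real design tools use the Colebrook correlation and round to standard duct sizes):

```python
import math

def equal_friction_diameter(Q, target_dp_per_m, rho=1.2, f=0.02):
    """Size a round duct by the equal friction method (illustrative sketch).

    Q: volume flow rate [m^3/s]; target_dp_per_m: design friction rate [Pa/m];
    rho: air density [kg/m^3]; f: Darcy friction factor (held constant here).
    From Darcy-Weisbach, dp/L = f * (rho * v^2 / 2) / D with v = 4Q/(pi D^2),
    which rearranges to D^5 = 8 f rho Q^2 / (pi^2 * dp/L).
    """
    D5 = 8.0 * f * rho * Q**2 / (math.pi**2 * target_dp_per_m)
    return D5 ** 0.2   # duct diameter in metres
```

Every duct in the network is sized with the same `target_dp_per_m`, which is what makes the method "equal friction"; the study's dynamic approach replaces this single static criterion with simulation-based optimisation.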
Abstract:
This report describes the analysis and development of novel tools for the global optimisation of relevant mission design problems. A taxonomy was created for mission design problems, and an empirical analysis of their optimisation complexity performed: it was demonstrated that the use of global optimisation was necessary for most classes, which informed the selection of appropriate global algorithms. The selected algorithms were then applied to the different problem classes; Differential Evolution was found to be the most efficient. Considering the specific problem of multiple gravity assist trajectory design, a search space pruning algorithm was developed that displays both polynomial time and space complexity. Empirically, this was shown to typically achieve search space reductions of greater than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation. The algorithm was fully implemented in a software package that allows simple visualisation of high-dimensional search spaces and effective optimisation over the reduced search bounds.
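Differential Evolution, identified above as the most efficient algorithm, follows a simple mutate-crossover-select loop over a population of candidate solutions. A minimal DE/rand/1/bin sketch (an illustration of the standard algorithm, not the thesis implementation; parameter values are conventional defaults):

```python
import numpy as np

def differential_evolution(fun, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimise fun over box bounds with DE/rand/1/bin.

    fun: objective taking a 1-D array; bounds: list of (lo, hi) per dimension;
    F: differential weight; CR: crossover probability.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    cost = np.array([fun(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct population members other than i
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
            # binomial crossover, forcing at least one mutant component
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            c_trial = fun(trial)
            if c_trial <= cost[i]:        # greedy selection
                pop[i], cost[i] = trial, c_trial
    best = int(np.argmin(cost))
    return pop[best], cost[best]
```

In trajectory design the objective would be, for example, total delta-v over launch date and swing-by parameters; here any black-box function of a real vector works.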
Abstract:
The past decade has witnessed a sharp increase in published research on energy and buildings. This paper takes stock of work in this area, with a particular focus on construction research and the analysis of non-technical dimensions. While there is widespread recognition of the importance of non-technical dimensions, research tends to be limited to individualistic studies of occupants and occupant behaviour. In contrast, publications in the mainstream social science literature display a broader range of interests, including policy developments, structural constraints on the diffusion and use of new technologies, and the construction process itself. The growing interest of more generalist scholars in energy and buildings provides an opportunity for construction research to engage a wider audience. This would enrich the current research agenda, helping to address unresolved questions concerning the relatively weak impact of policy mechanisms and new technologies and the seeming recalcitrance of occupants. It would also help to promote the academic status of construction research as a field. This, in turn, depends on greater engagement with interpretivist types of analysis and theory building, thereby challenging deeply ingrained views on the nature and role of academic research in construction.
Abstract:
A simple procedure was developed for packing PicoFrit HPLC columns with chromatographic stationary phase using a reservoir fabricated from standard laboratory HPLC fittings. Packed columns were mounted onto a stainless steel ultra-low volume precolumn filter assembly containing a 0.5-µm pore size steel frit. This format provided a conduit for the application of the nanospray voltage and protected the column from obstruction by sample material. The system was characterised and its operational performance assessed by analysis of a range of peptide standards (n = 9).
Abstract:
The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications to make the cumulative distribution function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems are discussed, and a novel way to solve the problem, combining extreme value analysis and non-parametric regression methods, is outlined. The method is illustrated with examples of hydrological stream-flow forecasts.
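The NQT itself is simple: replace each value by its empirical plotting position and push that probability through the inverse standard-normal CDF. A minimal sketch (the plotting-position convention r/(n+1) is a common choice and may differ from the paper's):

```python
import numpy as np
from scipy.stats import norm, rankdata

def nqt(x):
    """Normal Quantile Transform: map a sample to standard-normal scores."""
    x = np.asarray(x, dtype=float)
    ranks = rankdata(x)           # average ranks handle tied values
    p = ranks / (len(x) + 1)      # plotting positions strictly inside (0, 1)
    return norm.ppf(p)            # inverse standard-normal CDF
```

The small-sample problem the paper addresses is visible here: with n observations the largest transformed value is capped at `norm.ppf(n / (n + 1))`, so extrapolating beyond the observed range requires something like the extreme-value tail treatment the paper proposes.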
Abstract:
Background: Affymetrix GeneChip arrays are widely used for transcriptomic studies in a diverse range of species. Each gene is represented on a GeneChip array by a probe-set, consisting of up to 16 probe-pairs. Signal intensities across probe-pairs within a probe-set vary in part due to the different physical hybridisation characteristics of individual probes with their target labelled transcripts. We have previously developed a technique to study the transcriptomes of heterologous species based on hybridising genomic DNA (gDNA) to a GeneChip array designed for a different species, and subsequently using only those probes with good homology. Results: Here we have investigated the effects of hybridising homologous-species gDNA to study the transcriptomes of the species for which the arrays were designed. Genomic DNA from Arabidopsis thaliana and rice (Oryza sativa) was hybridised to the Affymetrix Arabidopsis ATH1 and Rice Genome GeneChip arrays respectively. Probe selection based on gDNA hybridisation intensity increased the number of genes identified as significantly differentially expressed in two published studies of Arabidopsis development, and optimised the analysis of technical replicates obtained from pooled samples of RNA from rice. Conclusion: This mixed physical and bioinformatics approach can be used to optimise estimates of gene expression when using GeneChip arrays.
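Probe selection by gDNA hybridisation intensity amounts to masking out probes whose genomic signal falls below a cutoff and discarding probe-sets left with too few probes. A simplified sketch (the data layout, threshold value and minimum-probe rule are hypothetical; the published method's masking details differ):

```python
def filter_probe_sets(intensities, min_probes=3, threshold=200.0):
    """Select probes by gDNA hybridisation signal (illustrative sketch).

    intensities: dict mapping probe-set id -> list of per-probe gDNA signals.
    Returns, for each retained probe-set, the indices of probes passing the
    cutoff; probe-sets with fewer than min_probes passing probes are dropped.
    """
    kept = {}
    for probe_set, values in intensities.items():
        idx = [i for i, v in enumerate(values) if v > threshold]
        if len(idx) >= min_probes:
            kept[probe_set] = idx
    return kept
```

Downstream expression summaries would then be computed only over the retained probe indices for each probe-set.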
Abstract:
Tremendous progress in plant proteomics, driven by mass spectrometry (MS) techniques, has been made since 2000, when few proteomics reports were published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that by 2050 the world's population will reach 9–12 billion people, demanding a 34–70% increase in food production over today's levels (FAO, 2009). Providing food for such a demand in a sustainable and environmentally committed manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, fewer postharvest losses, less waste generation and food with a longer shelf life. There is also a need to look for alternatives to animal-based protein sources (i.e., plant-based ones) to fulfil the increase in protein demand by 2050. Thus, plant biology has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform that has been utilised in plant biology research for the past 10 years and that has the potential to expedite the understanding of plant biology for human benefit. The increasing application of proteomics technologies to food security, analysis, and safety is emphasised. We are aware, however, that no single approach or technology is capable of addressing the global food issues: proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.
Abstract:
With the development of convection-permitting numerical weather prediction, the efficient use of high-resolution observations in data assimilation is becoming increasingly important. The operational assimilation of these observations, such as Doppler radar radial winds, is now common, though to avoid violating the assumption of uncorrelated observation errors the observation density is severely reduced. Improving the quantity of observations used, and the impact they have on the forecast, will require the introduction of the full, potentially correlated, error statistics. In this work, observation error statistics are calculated for the Doppler radar radial winds assimilated into the Met Office high-resolution UK model, using a diagnostic that makes use of statistical averages of observation-minus-background and observation-minus-analysis residuals. This is the first in-depth study using the diagnostic to estimate both horizontal and along-beam correlated observation errors. The results show that the Doppler radar radial wind error standard deviations are similar to those used operationally and increase with observation height. Surprisingly, the estimated observation error correlation length scales are longer than the operational thinning distance; they depend both on the height of the observation and on its distance from the radar. Further tests show that the long correlations cannot be attributed to the use of superobservations or to the background error covariance matrix used in the assimilation. The large horizontal correlation length scales are, however, partly a result of using a simplified observation operator.
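The residual-based diagnostic described above is commonly the one of Desroziers et al. (2005), which estimates the observation-error covariance as the statistical average of the product of observation-minus-analysis and observation-minus-background residuals; assuming that is the diagnostic meant, a minimal sketch with residuals collected over many assimilation cycles:

```python
import numpy as np

def desroziers_R(d_ob, d_oa):
    """Estimate the observation-error covariance from assimilation residuals.

    d_ob: observation-minus-background innovations, shape (n_cycles, n_obs);
    d_oa: observation-minus-analysis residuals, same shape.
    The diagnostic gives R ~ E[ d_oa d_ob^T ]; the expectation is taken as a
    sample mean over cycles, and the estimate is symmetrised at the end.
    """
    d_ob = np.asarray(d_ob, dtype=float)
    d_oa = np.asarray(d_oa, dtype=float)
    R = d_oa.T @ d_ob / d_ob.shape[0]   # sample average of outer products
    return 0.5 * (R + R.T)              # enforce symmetry of the covariance
```

Error standard deviations come from the square root of the diagonal, and correlation length scales from how the off-diagonal entries decay with observation separation.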
Abstract:
Aligning information systems (IS) solutions with business goals and needs is crucial for IS activities. IS professionals who are able to work closely with both business and technical staff are key enablers of business and IT alignment. IS programs in higher education (HE) institutions have a long tradition of enabling graduates to develop the skills needed for their future careers. Yet organizations still have difficulty finding graduates who possess both the knowledge and the skills best suited to their specific requirements, and prior studies suggest that IS curricula are often ill-matched with industry and business needs. This study reports on a business analysis curriculum (re)design undertaken to align it with a key professional body for the IS industry. It presents the approaches taken in the (re)design of the module and discusses the wider implications for IS curricula design. The results show a positive outcome for the HE and professional body partnership.
Abstract:
The prefrontal cortex executes important functions such as differentiation of conflicting thoughts, correct social behavior and personality expression, and is directly implicated in different neurodegenerative diseases. We performed a shotgun proteome analysis that included IEF fractionation, RP-LC, and MALDI-TOF/TOF mass spectrometric analysis of tryptic digests from a pool of seven human dorsolateral prefrontal cortex protein extracts. In this report, we present a catalog of 387 proteins expressed in these samples, identified by two or more peptides and high confidence search scores. These proteins are involved in different biological processes such as cell growth and/or maintenance, metabolism/energy pathways, cell communication/signal transduction, protein metabolism, transport, regulation of nucleobase, nucleoside, nucleotide and nucleic acid metabolism, and immune response. This analysis contributes to the knowledge of the human brain proteome by adding sample diversity and protein expression data from an alternative technical approach. It will also aid comparative studies of different brain areas and medical conditions, with future applications in basic and clinical research.