921 results for Operating Limit
Abstract:
Due to the importance and wide applications of DNA analysis, there is a need to make genetic analysis more available and more affordable. As such, the aim of this PhD thesis is to optimize a colorimetric DNA biosensor based on gold nanoprobes, developed at CEMOP, by reducing its price and the required solution volume without compromising the device's sensitivity and reliability, towards point-of-care use. Firstly, the price of the biosensor was decreased by replacing the silicon photodetector with a low-cost, solution-processed TiO2 photodetector. To further reduce the photodetector price, a novel fabrication method was developed: a cost-effective inkjet printing technology that increased the TiO2 surface area. Secondly, the DNA biosensor was optimized by means of microfluidics, which offers the advantages of miniaturization, much lower sample/reagent consumption, and enhanced system performance and functionality through the integration of different components. In the developed microfluidic platform, the optical path length was extended by detecting along the channel, and the light was transmitted by optical fibres, allowing it to be guided very close to the analysed solution. A microfluidic chip with a high aspect ratio (~13) and smooth, nearly vertical sidewalls was fabricated in PDMS using an SU-8 mould for patterning. The platform, coupled to the gold nanoprobe assay, enabled detection of Mycobacterium tuberculosis using 3 μl of DNA solution, i.e. 20 times less than the previous state of the art. Subsequently, the bio-microfluidic platform was optimized in terms of cost, electrical signal processing and sensitivity to colour variation, yielding a 160% improvement in the colorimetric AuNP analysis. Planar microlenses were incorporated to converge the light into the sample and then into the output fibre core, increasing the signal-to-loss ratio 6-fold. The optimized platform enabled detection of a single nucleotide polymorphism related to obesity risk (FTO) using a target DNA concentration below the limit of detection of the conventionally used microplate reader (i.e. 15 ng/μl) with a 10 times lower solution volume (3 μl). The combination of the unique optical properties of gold nanoprobes with the microfluidic platform resulted in a sensitive and accurate sensor for single nucleotide polymorphism detection that operates with small solution volumes and without the need for substrate functionalization or sophisticated instrumentation. Simultaneously, to enable on-chip reagent mixing, a PDMS micromixer was developed and optimized for the highest efficiency, low pressure drop and short mixing length. The optimized device shows 80% mixing efficiency at Re = 0.1 in a 2.5 mm long mixer with a pressure drop of 6 Pa, satisfying the requirements for application in the microfluidic platform for DNA analysis.
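The sensitivity gain from detecting along the channel follows from the Beer-Lambert law (standard form, not quoted from the thesis): absorbance grows linearly with the optical path length, so for a fixed nanoprobe concentration a longer in-channel path yields a proportionally larger colorimetric signal:

    A = \varepsilon\, c\, \ell = -\log_{10}\!\left( \frac{I}{I_0} \right),

where \varepsilon is the molar absorptivity, c the AuNP concentration, \ell the optical path length, and I/I_0 the transmitted fraction of light.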
Abstract:
NSBE - UNL
Abstract:
Introduction: Herpes simplex virus (HSV) and varicella zoster virus (VZV) are responsible for a variety of human diseases, including central nervous system diseases. The use of polymerase chain reaction (PCR) techniques on cerebrospinal fluid samples has allowed the detection of viral DNA with high sensitivity and specificity. Methods: Serial dilutions of quantified commercial controls of each virus were subjected to an in-house nested-PCR technique. Results: The minimum detection limits for HSV and VZV were 5 and 10 copies/µL, respectively. Conclusions: The detection limit of nested-PCR for HSV and VZV in this study was similar to the limits found in previous studies.
Abstract:
Research literature and regulators are unequivocal in pointing to the disclosure of operating cash flow under the direct method as a source of unique information. Besides its intuitive appeal, it is also useful in forecasting future operating cash flows and is a cohesive piece of the financial statement puzzle. Bearing this in mind, I analyse the usefulness and predictive ability of disclosing gross cash receipts and payments over disclosing a reconciliation between net income and accruals, for two markets with special features, Portugal and Spain. The results validate the usefulness of the direct method format in predicting future operating cash flow.
Abstract:
Since the last decade of the twentieth century, the healthcare industry has been paying attention to the environmental impact of its buildings, and therefore new regulations, policy goals and Healthcare Building Sustainability Assessment (HBSA) methods are being developed and implemented. At present, healthcare is one of the most regulated industries and it is also one of the largest consumers of energy per net floor area. To assess the sustainability of healthcare buildings it is necessary to establish a set of benchmarks related to their life-cycle performance. These are essential both to rate the sustainability of a project and to support designers and other stakeholders in the process of designing and operating a sustainable building, by allowing a comparison to be made between a project and the conventional and best market practices. This research focuses on the methodology used to set the benchmarks for resource consumption, waste production, operation costs and potential environmental impacts related to the operational phase of healthcare buildings. It aims at contributing to the reduction of the subjectivity found in the definition of the benchmarks used in Building Sustainability Assessment (BSA) methods, and it is applied in the Portuguese context. These benchmarks will be used in the development of a Portuguese HBSA method.
Abstract:
We investigate the long-term performance of firms cross-delisted from U.S. stock markets. Using a sample of foreign firms listed on and delisted from U.S. stock exchange markets over 2000-2012, we examine the operating performance and the long-run stock return performance of firms post-cross-delisting. Our results suggest that cross-delisted firms have fewer growth opportunities than matched cross-listed firms in the long run. Moreover, firms that cross-delist after the passage of Rule 12h-6 of 2007 exhibit a significant decline in operating performance. In contrast, before the adoption of Rule 12h-6, cross-delisted firms seem to be affected by the cost of a U.S. listing in the pre-cross-delisting period. In addition, we provide evidence that cross-delisted firms underperform their cross-listed peers; cross-delisted firms experience negative average abnormal returns, especially in the post-delisting period.
Abstract:
Master's dissertation in Bioinformatics
Abstract:
A search has been performed for pair production of heavy vector-like down-type (B) quarks. The analysis explores the lepton-plus-jets final state, characterized by events with one isolated charged lepton (electron or muon), significant missing transverse momentum and multiple jets. One or more jets are required to be tagged as arising from b-quarks, and at least one pair of jets must be tagged as arising from the hadronic decay of an electroweak boson. The analysis uses the full data sample of pp collisions recorded in 2012 by the ATLAS detector at the LHC, operating at a center-of-mass energy of 8 TeV, corresponding to an integrated luminosity of 20.3 fb⁻¹. No significant excess of events is observed above the expected background. Limits are set on vector-like B production, as a function of the B branching ratios, assuming the allowable decay modes are B→Wt/Zb/Hb. In the chiral limit with a branching ratio of 100% for the decay B→Wt, the observed (expected) 95% CL lower limit on the vector-like B mass is 810 GeV (760 GeV). In the case where the vector-like B quark has branching ratio values corresponding to those of an SU(2) singlet state, the observed (expected) 95% CL lower limit on the vector-like B mass is 640 GeV (505 GeV). The same analysis, when used to investigate pair production of a colored, charge 5/3 exotic fermion T5/3, with subsequent decay T5/3→Wt, sets an observed (expected) 95% CL lower limit on the T5/3 mass of 840 GeV (780 GeV).
Abstract:
This paper presents a model predictive current control applied to a proposed single-phase five-level active rectifier (FLAR). This current control strategy uses the discrete-time nature of the active rectifier to define its state in each sampling interval. Although the switching frequency is not constant, this current control strategy allows the current to follow the reference with low total harmonic distortion (THDF). The implementation of the active rectifier used to obtain the experimental results is described in detail throughout the paper, presenting the circuit topology, the principle of operation, the power theory, and the current control strategy. The experimental results confirm the robustness and good performance (with low current THDF and controlled output voltage) of the proposed single-phase FLAR operating with model predictive current control.
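As a rough sketch of how a finite-control-set predictive current controller of this kind operates (illustrative model and parameter values, not the authors' implementation): at each sampling instant the controller predicts, for every admissible converter voltage level, the next-step input current from a discretized L-R model and applies the level that minimizes the deviation from the reference.

    # Finite-control-set model predictive current control, one-step horizon.
    # Hypothetical input model: L di/dt = v_s - R*i - v_conv, with v_conv
    # restricted to five discrete levels (hence "five-level").
    import numpy as np

    L, R, Ts, Vdc = 5e-3, 0.1, 50e-6, 400.0               # illustrative values
    LEVELS = np.array([-1.0, -0.5, 0.0, 0.5, 1.0]) * Vdc  # five converter voltage levels

    def predict(i_k, v_s, v_conv):
        # One-step forward-Euler prediction of the input current.
        return i_k + (Ts / L) * (v_s - R * i_k - v_conv)

    def choose_level(i_k, i_ref, v_s):
        # Evaluate every admissible level; apply the one minimizing |i_ref - i_pred|.
        costs = [abs(i_ref - predict(i_k, v_s, v)) for v in LEVELS]
        return LEVELS[int(np.argmin(costs))]

Because the cost is re-evaluated at every sampling interval and the selected level changes only when needed, the resulting switching frequency is variable, which is exactly the trade-off noted in the abstract.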
Abstract:
PhD thesis in Business Sciences.
Abstract:
The purpose of this study was to evaluate the determinism of the AS-Interface network and the 3 main families of control systems which may use it, namely PLC, PC and RTOS. During the course of this study the PROFIBUS and Ethernet field-level networks were also considered, in order to ensure that they would not introduce unacceptable latencies into the overall control system. This research demonstrated that an incorrectly configured Ethernet network introduces unacceptable latencies of variable duration into the control system, so care must be exercised if the determinism of a control system is not to be compromised. This study introduces a new concept of using statistics and process capability metrics, in the form of Cpk values, to specify how suitable a control system is for a given control task. The PLC systems which were tested demonstrated extremely deterministic responses, but when a large number of iterations were introduced in the user program, the mean control system latency was much too great for an AS-I network. Thus the PLC was found to be unsuitable for an AS-I network if a large, complex user program is required. The PC systems which were tested were non-deterministic and had latencies of variable duration. These latencies became extremely exaggerated when a graphing ActiveX control was included in the control application. These PC systems also exhibited a non-normal frequency distribution of control system latencies, and as such are unsuitable for implementation with an AS-I network. The RTOS system which was tested overcame the problems identified with the PLC systems and produced an extremely deterministic response, even when a large number of iterations were introduced in the user program. The RTOS system which was tested is capable of providing a suitably deterministic control system response, even when an extremely large, complex user program is required.
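For reference, the process capability index applied here to control-system latencies is the standard Cpk: the distance from the mean latency to the nearer specification limit, in units of three standard deviations. A minimal sketch with hypothetical specification limits and simulated samples (not the study's data):

    import numpy as np

    def cpk(samples, lsl, usl):
        # Cpk = min((USL - mu)/(3*sigma), (mu - LSL)/(3*sigma))
        mu, sigma = np.mean(samples), np.std(samples, ddof=1)
        return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

    # Simulated latencies (ms) against a hypothetical 0-5 ms response window
    latencies = np.random.normal(2.0, 0.3, 10_000)
    print(round(cpk(latencies, lsl=0.0, usl=5.0), 2))

For a purely upper-bounded latency requirement only the (USL - mu)/(3*sigma) term is meaningful, and the index presumes roughly normal samples, which is precisely why the non-normal PC latency distributions reported above undermine its use there.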
Abstract:
Direct methanol fuel cell, DMFC, model, mass transport, Maxwell-Stefan, Flory-Huggins, crossover, polymer electrolyte membrane, Nafion
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings. This includes, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
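For concreteness, the classical statement and the Berry-Esseen bound referred to above, in their standard forms (not quoted from the thesis): for i.i.d. X_1, X_2, \dots with mean \mu, variance \sigma^2 \in (0,\infty) and S_n = X_1 + \dots + X_n,

    \sup_{x \in \mathbb{R}} \left| P\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \xrightarrow[n \to \infty]{} 0,

and, if additionally \varrho = \mathbb{E}\,|X_1 - \mu|^3 < \infty,

    \sup_{x \in \mathbb{R}} \left| F_n(x) - \Phi(x) \right| \le \frac{C\,\varrho}{\sigma^3 \sqrt{n}},

where F_n is the distribution function of the standardized sum, \Phi the standard normal distribution function, and C an absolute constant (known to satisfy C < 0.48).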
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
The choice of either the rate of monetary growth or the nominal interest rate as the instrument controlled by monetary authorities has both positive and normative implications for economic performance. We reexamine some of the issues related to the choice of the monetary policy instrument in a dynamic general equilibrium model exhibiting endogenous growth, in which a fraction of productive government spending is financed by issuing currency. When we evaluate the performance of the two monetary instruments with respect to the fluctuations of endogenous variables, we find that the inflation rate is less volatile under nominal interest rate targeting. Concerning the fluctuations of consumption and of the growth rate, both monetary policy instruments lead to statistically equivalent volatilities. Finally, we show that neither of the two targeting procedures displays unambiguously higher welfare levels.