921 results for Built-in test
Abstract:
Greenhouse cultivation is an energy-intensive process; it is therefore worthwhile to introduce energy-saving measures and alternative energy sources. Here we show that there is scope for energy saving in fan-ventilated greenhouses. Measurements of electricity usage as a function of fan speed have been performed for two models of 1.25 m diameter greenhouse fans and compared to theoretical values. Reducing the speed can cut the energy usage per volume of air moved by more than 70%. To minimize the capital cost of low-speed operation, a cooled greenhouse has been built in which the fan speed responds to sunlight such that full speed is reached only around noon. The energy saving is about 40% compared to constant-speed operation. Direct operation of fans from solar-photovoltaic modules is also viable, as shown by experiments with a fan driven by a brushless DC motor. On comparing the Net Present Value costs of the different systems over a 10-year amortization period (with and without a carbon tax to represent environmental costs), we find that the sunlight-controlled system saves money under all assumptions about taxation and discount rates. The solar-powered system, however, is only profitable for very low discount rates, due to the high initial capital costs. Nonetheless, this system could be of interest for its reliability in developing countries where mains electricity is intermittent. We recommend that greenhouse fan manufacturers improve the availability of energy-saving designs such as those described here.
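For orientation, the savings quoted above are consistent with the standard fan affinity laws (airflow roughly proportional to speed, shaft power roughly proportional to speed cubed, hence energy per volume of air roughly proportional to speed squared). The sketch below is a minimal illustration of that scaling and of a Net Present Value comparison; the numbers are placeholders, not the paper's measurements.

```python
# Illustrative sketch of fan affinity-law scaling and a simple NPV comparison.
# All numbers are placeholders, not the measured values from the abstract.

def energy_per_volume_ratio(speed_fraction):
    """Affinity laws: airflow ~ n, power ~ n^3, so energy per m^3 of air ~ n^2."""
    return speed_fraction ** 2

def npv(annual_cost, capital_cost, years=10, discount_rate=0.05):
    """Net Present Value of capital plus discounted running costs over the period."""
    return capital_cost + sum(annual_cost / (1 + discount_rate) ** t
                              for t in range(1, years + 1))

print(1 - energy_per_volume_ratio(0.5))           # 0.75 -> >70% saving at half speed

baseline  = npv(annual_cost=1000, capital_cost=0)   # constant-speed operation
sun_ctrl  = npv(annual_cost=600,  capital_cost=500) # ~40% energy saving, modest controls cost
print(baseline, sun_ctrl)
```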
Abstract:
We consider data losses in a single node of a packet-switched Internet-like network. We employ two distinct models, one with discrete and the other with continuous one-dimensional random walks, representing the state of a queue in a router. Both models have a built-in critical behavior with a sharp transition from exponentially small to finite losses. It turns out that the finite capacity of a buffer and the packet-dropping procedure give rise to specific boundary conditions which lead to strong loss rate fluctuations at the critical point even in the absence of such fluctuations in the data arrival process.
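A minimal sketch of the kind of model described, under stated assumptions: the queue length performs a biased random walk between zero and a finite buffer, packets arriving at a full buffer are dropped, and the loss rate switches sharply from negligible to finite once the arrival probability exceeds the service probability. Buffer size and probabilities are illustrative.

```python
import random

def loss_rate(p_arrival, p_service, buffer_size=100, steps=200_000, seed=0):
    """Fraction of arrivals dropped for a single queue with a finite buffer."""
    rng = random.Random(seed)
    q = 0
    arrivals = dropped = 0
    for _ in range(steps):
        if rng.random() < p_arrival:
            arrivals += 1
            if q < buffer_size:
                q += 1
            else:
                dropped += 1          # packet dropping at the buffer boundary
        if q > 0 and rng.random() < p_service:
            q -= 1
    return dropped / max(arrivals, 1)

# Service probability fixed at 0.5; losses become finite as p_arrival crosses it.
for p in (0.40, 0.49, 0.50, 0.51, 0.60):
    print(p, loss_rate(p, 0.5))
```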
Effect of a commercially available warm compress on eyelid temperature and tear film in healthy eyes
Abstract:
Purpose: To evaluate eyelid temperature change and short-term effects on tear film stability and lipid layer thickness in healthy patients using a commercially available warm compress (MGDRx EyeBag) for ophthalmic use. Methods: Eyelid temperature, noninvasive tear film breakup time (NITBUT), and tear film lipid layer thickness (TFLLT) of 22 healthy subjects were measured at baseline, immediately after, and 10 minutes after application of a heated eyebag for 5 minutes to one eye selected at random. A nonheated eyebag was applied to the contralateral eye as a control. Results: Eyelid temperatures, NITBUT, and TFLLT increased significantly from baseline in test eyes immediately after removal of the heated eyebag compared with those in control eyes (maximum temperature change, 2.3 ± 1.2 °C vs. 0.3 ± 0.5 °C, F = 20.533, p < 0.001; NITBUT change, 4.0 ± 2.3 seconds vs. 0.4 ± 1.7 seconds, p < 0.001; TFLLT change, 2.0 ± 0.9 grades vs. 0.1 ± 0.4 grades, Z = -4.035, p < 0.001). After 10 minutes, measurements remained significantly higher than those in controls (maximum temperature change, 1.0 ± 0.7 °C vs. 0.1 ± 0.3 °C, F = 14.247, p < 0.001; NITBUT change, 3.6 ± 2.1 seconds vs. 0.1 ± 1.9 seconds, p < 0.001; TFLLT change, 1.5 ± 0.9 vs. 0.2 ± 0.5 grades, Z = -3.835, p < 0.001). No adverse events occurred during the study. Conclusions: The MGDRx EyeBag is a simple device for heating the eyelids, producing increases in NITBUT and TFLLT in subjects without meibomian gland dysfunction that appear to be clinically significant. Future studies are required to determine clinical efficacy and evaluate safety after long-term therapy in meibomian gland dysfunction patients. © 2013 American Academy of Optometry
Abstract:
The present article discusses units of measure and their base units, and the working environments built into the Units package of the computer algebra system Maple. The package's tools are analysed in connection with the use of physical quantities and their features. Maple's main commands are arranged in groups depending on their function. Some applied mathematical problems are given as examples, making use of derivatives, integrals and differential equations.
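Maple's Units environment is not reproduced here; as a rough analogue, the sketch below uses Python's pint library to show the same idea of attaching units to quantities, converting to base units, and carrying units through a simple applied calculation.

```python
import pint

ureg = pint.UnitRegistry()

distance = 120.0 * ureg.kilometer
time = 1.5 * ureg.hour

speed = distance / time                     # units are carried automatically
print(speed.to("meter / second"))           # convert to SI base units
print(speed.to_base_units())

# Finite-difference "derivative" of position with respect to time,
# illustrating how units propagate through an applied problem.
dx = 3.0 * ureg.meter
dt = 0.5 * ureg.second
print((dx / dt).to("kilometer / hour"))
```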
Abstract:
Optical-phase conjugation nonlinearity compensation (OPC-NLC) in optical networks is evaluated using a built-in tool that includes self-channel and crosstalk-channel interference effects. Though significant improvements are observed, a further refined launch-power policy is required to take full advantage of the OPC-NLC capability.
Abstract:
This thesis addressed the problem of risk analysis in mental healthcare, with respect to the GRiST project at Aston University. That project provides a risk-screening tool based on the knowledge of 46 experts, captured as mind maps that describe relationships between risks and patterns of behavioural cues. Mind mapping, though, fails to impose control over content, and is not considered to formally represent knowledge. In contrast, this thesis treated GRiST's mind maps as a rich knowledge base in need of refinement; that process drew on existing techniques for designing databases and knowledge bases. Identifying well-defined mind map concepts, though, was hindered by spelling mistakes, and by ambiguity and lack of coverage in the tools used for researching words. A novel use of the Edit Distance overcame those problems, by assessing similarities between mind map texts, and between spelling mistakes and suggested corrections. That algorithm further identified stems, the shortest text strings found in related word-forms. As opposed to existing approaches' reliance on built-in linguistic knowledge, this thesis devised a novel, more flexible text-based technique. An additional tool, Correspondence Analysis, found patterns in word usage that allowed machines to determine likely intended meanings for ambiguous words. Correspondence Analysis further produced clusters of related concepts, which in turn drove the automatic generation of novel mind maps. Such maps underpinned adjuncts to the mind mapping software used by GRiST; one such new facility generated novel mind maps, to reflect the collected expert knowledge on any specified concept. Mind maps from GRiST are stored as XML, which suggested storing them in an XML database. In fact, the entire approach here is "XML-centric", in that all stages rely on XML as far as possible. An XML-based query language allows users to retrieve information from the mind map knowledge base. The approach, it was concluded, will prove valuable to mind mapping in general, and to detecting patterns in any type of digital information.
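The thesis's exact Edit Distance variant (and its extension to stem identification) is not given in the abstract; the sketch below is the standard Levenshtein edit distance, the usual starting point for scoring how close a misspelled cue word is to candidate corrections.

```python
def edit_distance(a: str, b: str) -> int:
    """Standard Levenshtein distance via dynamic programming."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                           # deletions
    for j in range(n + 1):
        d[0][j] = j                           # insertions
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # delete
                          d[i][j - 1] + 1,        # insert
                          d[i - 1][j - 1] + cost) # substitute
    return d[m][n]

# Rank hypothetical candidate corrections for a misspelled cue word.
candidates = ["anxiety", "anxious", "apathy"]
print(sorted(candidates, key=lambda w: edit_distance("anxeity", w)))
```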
Abstract:
With the advantages and popularity of Permanent Magnet (PM) motors due to their high power density, there is an increasing incentive to use them in a variety of applications, including electric actuation. These applications have strict noise emission standards. The generation of audible noise and associated vibration modes is characteristic of all electric motors; it is especially problematic in low-speed sensorless control rotary actuation applications using the high-frequency voltage injection technique. This dissertation is aimed at solving the problem of optimizing the sensorless control algorithm for low noise and vibration while achieving at least 12-bit absolute accuracy for speed and position control. The low-speed sensorless algorithm is simulated using an improved Phase Variable Model, developed and implemented in a hardware-in-the-loop prototyping environment. Two experimental testbeds were developed and built to test and verify the algorithm in real time. A neural network based modeling approach was used to predict the audible noise due to the high-frequency injected carrier signal. This model was created based on noise measurements in a specially built chamber. The developed noise model is then integrated into the high-frequency-based sensorless control scheme so that appropriate tradeoffs and mitigation techniques can be devised. This will improve the position estimation and control performance while keeping the noise below a certain level. Genetic algorithms were used for including the noise optimization parameters into the developed control algorithm. A novel wavelet-based filtering approach was proposed in this dissertation for the sensorless control algorithm at low speed. This novel filter was capable of extracting the position information at low values of injection voltage where conventional filters fail. This filtering approach can be used in practice to reduce the injected voltage in the sensorless control algorithm, resulting in a significant reduction of noise and vibration. Online optimization of the sensorless position estimation algorithm was performed to reduce vibration and to improve the position estimation performance. The results obtained are important and represent original contributions that can be helpful in choosing optimal parameters for the sensorless control algorithm in many practical applications.
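The dissertation's wavelet filter is not specified in enough detail to reproduce; the sketch below only illustrates the general idea with PyWavelets, soft-thresholding the detail coefficients of a weak, noisy envelope so that a low-amplitude position-dependent component survives. All signal parameters are invented.

```python
import numpy as np
import pywt

fs = 10_000                                    # sample rate (Hz), assumed
t = np.arange(0, 0.2, 1 / fs)
position_envelope = 0.05 * np.sin(2 * np.pi * 5 * t)   # weak position-dependent component
noisy = position_envelope + 0.02 * np.random.randn(t.size)

# Multi-level wavelet decomposition, soft-threshold the detail coefficients,
# then reconstruct: broadband noise is suppressed while the slow envelope is kept.
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest level
thr = sigma * np.sqrt(2 * np.log(noisy.size))
denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
recovered = pywt.waverec(denoised, "db4")[: noisy.size]

print(np.corrcoef(recovered, position_envelope)[0, 1])  # should be close to 1
```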
Abstract:
INTRODUCTION: Humanized and quality prenatal and post-partum care is critical to maternal and newborn health, as well as oral health care. Currently, the National Oral Health Policy is aiming at expanding dental care for pregnant women. Thus, the promotion of oral health and attention to prenatal care policies should be integrated; however, there is still limited participation of pregnant women. It is therefore necessary to verify the knowledge of pregnant women related to oral health, seeking to estimate the quality of dental care provided during prenatal care, which is essential for the Family Health strategy to organize personnel, plan costs and ensure the quality standard of care. OBJECTIVE: To develop and validate a research instrument on the knowledge of pregnant women about their own oral health and that of their baby. METHOD: This is a construction and validation study with 93 pregnant women in Family Health Units and specialized private Obstetrics clinics in the city of Natal/RN. It was authorized by the Onofre Lopes University Hospital Ethics Committee of the Universidade Federal do Rio Grande do Norte (UFRN) under registration number 421.163/13. The construction of the instrument followed steps so that it was valid, reliable and sensitive: creation and reduction of the items (drafting of the instrument), content validity and testing of the instrument, and hypothesis validation. Once constructed, the instrument was evaluated by experts who suggested modifications. The target population was consulted about the new version of the instrument, whose validity was verified by internal consistency through intra- and inter-examiner calibration and test-retest. Next, the hypotheses were validated. A database was built in the Statistical Package for the Social Sciences (SPSS), version 22.0. After creating the hypotheses, associations were tested for criterion validation between each of the specific questions and each established criterion, considering a 5% significance level. Data analysis was carried out by describing the absolute and relative frequencies of the variables pertaining to pregnant women's knowledge about their own oral health and that of their baby. The Kappa coefficient was used for the calibration process (inter- and intra-examiner calibration) and Cronbach's alpha coefficient was used to analyze instrument reproducibility (test-retest). In addition, the chi-square test was used to cross the dependent variable with the (dichotomized) independent variables. RESULTS: The intra- and inter-examiner agreement analysis presented a Kappa coefficient between 0.400 and 1.000. The internal consistency analysis showed that 90% of the instrument's questions showed great reliability in the answers (Cronbach's α > 0.7). In the investigation of the relationship between the dependent variable (knowledge about oral health) and the independent variables (trimester of pregnancy, education, income and multiparity), it was found that none of these independent variables were significantly associated. All hypotheses had their null hypothesis (H0) confirmed. CONCLUSION: The constructed instrument was validated, considering that it showed to be sensitive with good reliability and good accuracy, and therefore can be used to assess pregnant women's knowledge about their own oral health and the oral health of their baby.
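For reference, the Cronbach's alpha statistic used above can be computed directly from a respondents-by-items score matrix; a minimal sketch with made-up answers:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items matrix of numeric answers."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Made-up answers from 6 respondents to 4 questionnaire items (0/1/2 scoring).
scores = np.array([[2, 2, 1, 2],
                   [1, 1, 1, 1],
                   [2, 2, 2, 2],
                   [0, 1, 0, 1],
                   [2, 1, 2, 2],
                   [1, 1, 1, 0]])
print(cronbach_alpha(scores))   # values above 0.7 are commonly read as acceptable reliability
```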
Abstract:
The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.
At the present time, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs is different from that of traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high-density I/O ports and interconnects, (5) reduced number of test pins, and (6) high power consumption. This research targets the above challenges, and effective solutions have been developed to test both the dies and the interposer.
The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.
In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.
To address the scenario of high density of I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear-feedback shift register (LFSR), a multiple-input signature register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
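The abstract does not give register widths or feedback polynomials; the sketch below is a generic illustration of the two BIST building blocks named above: an LFSR generating pseudo-random interconnect stimuli and an MISR compacting the observed responses into a signature that is compared against a fault-free reference.

```python
def lfsr(seed: int, taps: tuple, width: int, steps: int):
    """Fibonacci LFSR: yields one pseudo-random test pattern per clock."""
    state = seed
    for _ in range(steps):
        yield state
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)

def misr(responses, taps: tuple, width: int) -> int:
    """Multiple-input signature register: compacts a response stream into a signature."""
    sig = 0
    for r in responses:
        fb = 0
        for t in taps:
            fb ^= (sig >> t) & 1
        sig = (((sig << 1) | fb) ^ r) & ((1 << width) - 1)
    return sig

# Apply patterns to a set of interconnects (modelled here as identity wiring) and
# compare the compacted signature with the fault-free reference signature.
patterns = list(lfsr(seed=0b1011, taps=(3, 2), width=4, steps=15))
golden = misr(patterns, taps=(3, 2), width=4)
faulty_responses = [p ^ 0b0001 if i == 7 else p for i, p in enumerate(patterns)]  # one flipped bit
observed = misr(faulty_responses, taps=(3, 2), width=4)
print(golden == observed)   # False -> defect detected
```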
In order to accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of available pins at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. In order to minimize the test time, two optimization solutions are introduced. The first solution minimizes the number of input test pins, and the second solution minimizes the number of output test pins. In addition, two subgroup configuration methods are further proposed to generate subgroups inside each test group.
Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in the 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not be toggled at the same time during shift. Therefore, the proposed programmable method does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed and the shared boundary length between blocks is then calculated. Based on the position relationships between the blocks, a mathematical model is presented to derive optimal results for small-to-medium-sized problems. For larger designs, a heuristic algorithm is proposed and evaluated.
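The mathematical model and heuristic are not reproduced here; the sketch below is only a simple greedy illustration of the stated constraint, assigning stagger values so that blocks sharing a power-rail boundary avoid receiving the same value. The block names and boundary lengths are invented.

```python
def assign_stagger(shared_boundary, num_staggers):
    """shared_boundary: dict mapping (block_a, block_b) -> shared power-rail length.
    Greedy, colouring-style assignment: each block takes the stagger value used
    by the fewest of its already-assigned neighbours (zero when one is free)."""
    blocks = sorted({b for pair in shared_boundary for b in pair})
    neighbours = {b: set() for b in blocks}
    for (a, b), length in shared_boundary.items():
        if length > 0:
            neighbours[a].add(b)
            neighbours[b].add(a)
    # Assign blocks with the largest total shared boundary first.
    order = sorted(blocks, key=lambda blk: -sum(
        length for pair, length in shared_boundary.items() if blk in pair))
    stagger = {}
    for blk in order:
        counts = {s: 0 for s in range(num_staggers)}
        for n in neighbours[blk]:
            if n in stagger:
                counts[stagger[n]] += 1
        stagger[blk] = min(counts, key=counts.get)   # fewest conflicting neighbours
    return stagger

# Hypothetical floorplan: shared power-rail boundary lengths between SoC blocks.
boundaries = {("cpu", "gpu"): 8.0, ("cpu", "mem"): 5.0,
              ("gpu", "mem"): 2.0, ("mem", "io"): 4.0}
print(assign_stagger(boundaries, num_staggers=3))
```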
In summary, the dissertation targets important design and optimization problems related to testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experimental results, and a set of test and design-for-test methods to make testing effective and feasible from a cost perspective.
Abstract:
I study the link between capital markets and sources of macroeconomic risk. In chapter 1 I show that expected inflation risk is priced in the cross section of stock returns even after controlling for cash flow growth and volatility risks. Motivated by this evidence, I study a long run risk model with a built-in inflation non-neutrality channel that allows me to decompose the real stochastic discount factor into news about current and expected cash flow growth, news about expected inflation and news about volatility. The model can successfully price a broad menu of assets and provides a setting for analyzing cross sectional variation in the expected inflation risk premium. For industries like retail and durable goods, inflation risk can account for nearly a third of the overall risk premium, while the energy industry and a broad commodity index act like inflation hedges. Nominal bonds are exposed to expected inflation risk and have inflation premiums that increase with bond maturity. The price of expected inflation risk was very high during the 1970s and 1980s, but has since come down considerably, being very close to zero over the past decade. On average, the expected inflation price of risk is negative, consistent with the view that periods of high inflation represent a "bad" state of the world and are associated with low economic growth and poor stock market performance. In chapter 2 I look at the way capital markets react to predetermined macroeconomic announcements. I document significantly higher excess returns on the US stock market on macro release dates as compared to days when no macroeconomic news hits the market. Almost the entire equity premium since 1997 has been realized on days when macroeconomic news is released. At high frequency, there is a pattern of returns increasing in the hours prior to the predetermined announcement time, peaking around the time of the announcement and dropping thereafter.
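The long-run risk model itself is not reproduced here; as a rough illustration of how a price of expected inflation risk can be read off the cross section, the sketch below runs a two-pass, Fama-MacBeth-style regression on a made-up panel: time-series regressions estimate each portfolio's inflation beta, and a cross-sectional regression of average returns on those betas recovers the (negative) price of risk.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 240, 10                          # months, portfolios (made-up panel)
infl_news = rng.normal(0, 1, T)         # expected-inflation news factor
betas_true = np.linspace(-0.5, 0.5, N)
lam_true = -0.2                         # negative price: high inflation is a "bad" state
returns = (betas_true * lam_true
           + np.outer(infl_news, betas_true)
           + rng.normal(0, 2, (T, N)))

# Pass 1: time-series regression of each portfolio's return on the inflation factor.
betas_hat = np.array([np.polyfit(infl_news, returns[:, i], 1)[0] for i in range(N)])

# Pass 2: cross-sectional regression of average returns on the estimated betas.
lam_hat = np.polyfit(betas_hat, returns.mean(axis=0), 1)[0]
print(lam_hat)   # recovers a negative price of expected-inflation risk
```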
Abstract:
Evolution of the planktic foraminiferal lineage Globorotalia (Fohsella) occurred during the Miocene between 23.7 and 11.8 Ma and forms the basis for stratigraphic subdivision of the early middle Miocene (Zones N10 through N12). Important morphologic changes within the G. (Fohsella) lineage included a marked increase in test size, a transition from a rounded to an acute periphery, and the development of a keel in later forms. We found that the most rapid changes in morphology of G. (Fohsella) occurred between 13 and 12.7 Ma and coincided with an abrupt increase in the δ18O ratios of shell calcite. Comparison of isotopic results of G. (Fohsella) with other planktic foraminifers indicates that δ18O values of the lineage diverge from surface-dwelling species and approach deep-dwelling species after 13.0 Ma, indicating a change in depth habitat from the surface mixed layer to intermediate depths near the thermocline. Isotopic and faunal evidence suggests that this change in depth stratification was associated with an expansion of the thermocline in the western equatorial Pacific. After adapting to a deeper water habitat at 13.0 Ma, the G. (Fohsella) lineage became extinct abruptly at 11.8 Ma, during a period when isotopic and faunal evidence suggest a shoaling of the thermocline. Following the extinction of G. (Fohsella), the ecologic niche of the lineage was filled by the Globorotalia (Menardella) group, which began as a deep-water form and later evolved to an intermediate-water habitat. We suggest that the evolution of G. (Fohsella) and G. (Menardella) was tightly linked to changes in the structure of the thermocline in the western equatorial Pacific.
Abstract:
This article analyses the motivations for return migration among the Ecuadorians and Bolivians who, after living in Spain, returned to their countries of origin during the economic crisis that started in 2008. Based on the analysis of 22 in-depth interviews carried out in Ecuador and 38 in Bolivia with women, men and young people from migrant families, this decision-making process is shown to be embedded in gendered relationship dynamics. Particular attention is given to the affective and economic elements that influenced the decision to return, as well as to the strategies deployed to plan their readjustment back in the country of origin. Men and women occupy different positions within the family, work and social circle, and their expectations are built in a gendered manner. Despite the fact that migration has brought women greater economic power within the family group, their reintegration upon return redefines their role as the main managers of the household and of the dynamics that allow its social reproduction. Men, for their part, aspire to resume their role as providers in spite of their fragile labour position upon return. For women, social mobility is passed on through generations via a strong investment in the education of their daughters and sons, while for men this mobility revolves around setting up family businesses and around their demonstrative abilities.