88 results for user-oriented design
at Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
This paper presents an analysis of the performance of a baseband multiple-input single-output (MISO) time reversal ultra-wideband system (TR-UWB) incorporating a symbol-spaced decision feedback equalizer (DFE). A semi-analytical performance analysis based on a Gaussian approach is considered, which matches simulation results well, even for the DFE case. The channel model adopted is based on the IEEE 802.15.3a model, considering correlated shadowing across antenna elements. To provide a more realistic analysis, channel estimation errors are considered in the design of the TR filter. A guideline for the choice of equalizer length is provided. The results show that the system's performance improves with an increase in the number of transmit antennas and when a symbol-spaced equalizer is used with a relatively small number of taps compared to the number of resolvable paths in the channel impulse response. Moreover, it is possible to conclude that, due to the time reversal scheme, error propagation in the DFE does not play a role in the system's performance.
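The energy-focusing property of time reversal that the abstract relies on can be illustrated with a small sketch (the channel taps below are invented, not from the paper): the transmit prefilter is the time-reversed channel estimate, so the effective channel is the channel autocorrelation, whose energy concentrates in one dominant centre tap; this is why a short symbol-spaced equalizer can suffice.

```python
# Hedged sketch of time-reversal (TR) prefiltering; the channel taps are
# invented for illustration. The TR filter is the time-reversed channel
# estimate, so the effective channel (filter convolved with channel) is the
# channel autocorrelation, which concentrates energy in the centre tap.

def convolve(a, b):
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

h = [0.9, -0.4, 0.2, -0.1]          # toy multipath channel impulse response
tr_filter = h[::-1]                  # time-reversed channel (real-valued case)
effective = convolve(tr_filter, h)   # autocorrelation of h

peak = max(effective, key=abs)       # centre tap: sum of h_k^2 = 1.02
print(round(peak, 4))
```

With imperfect channel estimates (as the paper considers), the sidelobes of this autocorrelation grow, which is exactly the residual interference an equalizer must clean up.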
Abstract:
We have designed, built, and tested an early prototype of a novel subxiphoid access system intended to facilitate epicardial electrophysiology, but with possible applications elsewhere in the body. The present version of the system consists of a commercially available insertion needle, a miniature pressure sensor and interconnect tubing, read-out electronics to monitor the pressures measured during the access procedure, and a host computer with user-interface software. The nominal resolution of the system is <0.1 mmHg, and it has deviations from linearity of <1%. During a pilot series of human clinical studies with this system, as well as in an auxiliary study done with an independent method, we observed that the pericardial space contained pressure-frequency components related to both the heart rate and respiratory rate, while the thorax contained components related only to the respiratory rate, a previously unobserved finding that could facilitate access to the pericardial space. We present and discuss the design principles, details of construction, and performance characteristics of this system.
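The frequency-content observation in the abstract (cardiac plus respiratory components in the pericardial pressure, respiratory only in the thorax) can be sketched with a plain DFT on a synthetic trace; the signal below is invented for illustration, with a 0.25 Hz "respiratory" and a 1.25 Hz "cardiac" component.

```python
import cmath
import math

# Hedged illustration of the pressure-frequency idea: a synthetic
# "pericardial-like" trace (invented numbers) containing a respiratory
# component (0.25 Hz) and a cardiac component (1.25 Hz) is decomposed
# with a plain DFT; both spectral peaks appear.

fs = 20.0   # sample rate, Hz
n = 160     # 8 s of data
signal = [math.sin(2 * math.pi * 0.25 * t / fs)
          + 0.5 * math.sin(2 * math.pi * 1.25 * t / fs)
          for t in range(n)]

# magnitude spectrum for the positive-frequency bins
spectrum = [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

# the two strongest non-DC bins and their frequencies in Hz
peaks = sorted(range(1, n // 2), key=lambda k: spectrum[k], reverse=True)[:2]
print(sorted(k * fs / n for k in peaks))  # → [0.25, 1.25]
```

A thorax-like trace, by the same analysis, would show only the 0.25 Hz peak.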
Abstract:
For centuries, specific instruments or regular toothbrushes have routinely been used to remove tongue biofilm and improve breath odor. Toothbrushes with a tongue scraper on the back of their head have recently been introduced to the market. The present study compared the effectiveness of a manual toothbrush with this new design (i.e., possessing a tongue scraper) and a commercial tongue scraper in improving breath odor and reducing the aerobic and anaerobic microbiota of the tongue surface. The evaluations occurred at 4 moments, when the participants (n=30) had their halitosis quantified with a halimeter and scored according to a 4-point scoring system corresponding to different levels of intensity. Saliva was collected for counts of aerobic and anaerobic microorganisms. Data were analyzed statistically by Friedman's test (p<0.05). When differences were detected, the Wilcoxon test adjusted with Bonferroni correction was used for multiple comparisons (group to group). The results confirmed the importance of mechanical cleaning of the tongue, since this procedure provided an improvement in halitosis and a reduction of aerobic and anaerobic counts. Regarding the evaluated methods, the toothbrush's tongue scraper and the conventional tongue scraper had similar performance in terms of breath improvement and reduction of tongue microbiota, and may be indicated as effective methods for tongue cleaning.
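The Friedman test used in the study compares repeated measurements on the same subjects by ranking scores within each subject. A minimal stdlib sketch (the scores below are invented, not the study's data; the classic statistic without the tie-correction factor):

```python
# Friedman chi-square statistic, uncorrected for ties:
#   chi2 = 12 / (n*k*(k+1)) * sum(R_j^2) - 3*n*(k+1)
# where R_j are the per-condition rank sums over n subjects and k conditions.
# The halitosis-like scores below are invented for illustration.

def friedman_chi2(blocks):
    # blocks: one list of k condition scores per subject
    k = len(blocks[0])
    n = len(blocks)
    rank_sums = [0.0] * k
    for scores in blocks:
        order = sorted(range(k), key=lambda j: scores[j])
        ranks = [0.0] * k
        i = 0
        while i < k:                      # assign average ranks over ties
            j = i
            while j + 1 < k and scores[order[j + 1]] == scores[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))

data = [[3, 1, 1], [2, 1, 2], [3, 2, 1], [3, 1, 2]]  # 4 subjects, 3 moments
print(round(friedman_chi2(data), 3))  # → 4.875
```

The statistic is then compared against a chi-square distribution with k-1 degrees of freedom; the study follows significant results with Bonferroni-adjusted Wilcoxon pairwise comparisons.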
Abstract:
This study evaluated the effect of specimen design and manufacturing process on microtensile bond strength, internal stress distributions (Finite Element Analysis - FEA) and specimen integrity by means of Scanning Electron Microscopy (SEM) and Laser Scanning Confocal Microscopy (LCM). Excite was applied to a flat enamel surface and resin composite build-ups were made incrementally with 1-mm increments of Tetric Ceram. Teeth were cut using a diamond disc or a diamond wire, obtaining 0.8 mm² stick-shaped specimens, or were shaped with a Micro Specimen Former, obtaining dumbbell-shaped specimens (n = 10). Samples were randomly selected for SEM and LCM analysis. The remaining samples underwent the microtensile test, and results were analyzed with ANOVA and the Tukey test. The FEA dumbbell-shaped model resulted in a more homogeneous stress distribution. Nonetheless, dumbbell specimens failed at lower bond strengths (21.83 ± 5.44 MPa, group c) than stick-shaped specimens (sectioned with wire: 42.93 ± 4.77 MPa, group a; sectioned with disc: 36.62 ± 3.63 MPa, group b), due to geometric irregularities related to the manufacturing process, as noted in the microscopic analyses. It could be concluded that stick-shaped, nontrimmed specimens, sectioned with diamond wire, are preferred for enamel specimens as they can be prepared in a less destructive, easier, and more precise way.
Abstract:
This work describes the construction and testing of a simple pressurized solvent extraction (PSE) system. A mixture of acetone:water (80:20), at 80 ºC and 103.5 bar, was used to extract two herbicides (Diuron and Bromacil) from a sample of polluted soil, followed by identification and quantification by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). The system was also used to extract soybean oil (70 ºC and 69 bar) using pentane. The extracted oil was weighed and characterized through fatty acid methyl ester analysis (myristic (< 0.3%), palmitic (16.3%), stearic (2.8%), oleic (24.5%), linoleic (46.3%), linolenic (9.6%), arachidic (0.3%), gadoleic (< 0.3%), and behenic (0.3%) acids) using high-resolution gas chromatography with flame ionization detection (HRGC-FID). PSE results were compared with those obtained using classical procedures: Soxhlet extraction for the soybean oil and solid-liquid extraction followed by solid-phase extraction (SLE-SPE) for the herbicides. The results showed: 21.25 ± 0.36% (m/m) of oil in the soybeans using the PSE system and 21.55 ± 0.65% (m/m) using the Soxhlet extraction system; extraction efficiency (recovery) of the herbicides Diuron and Bromacil of 88.7 ± 4.5% and 106.6 ± 8.1%, respectively, using the PSE system, and 96.8 ± 1.0% and 94.2 ± 3.9%, respectively, with the SLE-SPE system; limit of detection (LOD) and limit of quantification (LOQ) for Diuron of 0.012 mg kg-1 and 0.040 mg kg-1, respectively; LOD and LOQ for Bromacil of 0.025 mg kg-1 and 0.083 mg kg-1, respectively. The linearity range used was 0.04 to 1.50 mg L-1 for Diuron and 0.08 to 1.50 mg L-1 for Bromacil. In conclusion, using the PSE system, due to the high pressure and temperature, it is possible to make efficient, fast extractions with reduced solvent consumption in an inert atmosphere, which prevents sample and analyte decomposition.
Abstract:
This paper revisits the design of L and S band bridged loop-gap resonators (BLGRs) for electron paramagnetic resonance applications. A novel configuration is described and extensively characterized for resonance frequency and quality factor as a function of the geometrical parameters of the device. The obtained experimental results indicate higher values of the quality factor (Q) than previously reported in the literature, and the experimental analysis data should provide useful guidelines for BLGR design.
Abstract:
This paper proposes a new design methodology for discrete multi-pumped Raman amplifiers. In a multi-objective optimization scenario, the whole solution space is first inspected with a CW analytical formulation. Then, the most promising solutions are fully investigated by a rigorous numerical treatment, and the Raman amplification performance is thus determined by the combination of analytical and numerical approaches. As an application of our methodology we designed a photonic crystal fiber Raman amplifier configuration which provides low ripple, high gain, clear eye opening and a low power penalty. The amplifier configuration also makes it possible to fully compensate the dispersion introduced by a 70-km singlemode fiber in a 10 Gbit/s system. We have successfully obtained a configuration with 8.5 dB average gain over the C-band and 0.71 dB ripple with almost zero eye-penalty using only two pump lasers with relatively low pump power. (C) 2009 Optical Society of America
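The two-step pattern in the abstract, a cheap analytical screen of the whole solution space followed by a rigorous numerical pass over the survivors, can be sketched generically; both objective functions below are toy stand-ins, not the paper's amplifier models.

```python
# Hedged sketch of the screen-then-refine design methodology. A fast
# analytical surrogate ranks every point of a pump-parameter grid; only
# the top candidates are re-evaluated with the "expensive" numerical
# model. Both gain functions are invented toys for illustration.

def analytic_gain(p1, p2):
    # cheap closed-form surrogate (toy)
    return 10 - (p1 - 0.3) ** 2 - (p2 - 0.6) ** 2

def numerical_gain(p1, p2):
    # rigorous model stand-in (toy, optimum slightly shifted)
    return 10 - (p1 - 0.32) ** 2 - (p2 - 0.58) ** 2 - 0.05 * p1 * p2

grid = [(i / 20, j / 20) for i in range(21) for j in range(21)]

# step 1: screen the full grid with the analytical surrogate
screened = sorted(grid, key=lambda pq: analytic_gain(*pq), reverse=True)[:10]

# step 2: pick the best survivor under the numerical model
best = max(screened, key=lambda pq: numerical_gain(*pq))
print(best)
```

The payoff of the combination is that the expensive model is evaluated only 10 times instead of 441, while the shifted numerical optimum is still found among the analytically screened candidates.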
Abstract:
In this work, the effects of conical indentation variables on the load-depth indentation curves were analyzed using finite element modeling and dimensional analysis. A 2⁶ factorial design was used with the aim of quantifying the effects of the mechanical properties of the indented material and of the indenter geometry. The analysis was based on the input variables Y/E, R/h_max, n, θ, E, and h_max. The dimensional variables E and h_max were used such that each value of the dimensionless ratio Y/E was obtained with two different values of E, and each value of the dimensionless ratio R/h_max was obtained with two different h_max values. A set of dimensionless functions was defined to analyze the effect of the input variables: Π₁ = P/(E h²), Π₂ = h_c/h, Π₃ = H/Y, Π₄ = S/(E h_max), Π₆ = h_max/h_f, and Π₇ = W_P/W_T. These six functions were found to depend only on the dimensionless variables studied (Y/E, R/h_max, n, θ). Another dimensionless function, Π₅ = β, was not well defined for most of the dimensionless variables, and the only variable that had a significant effect on β was θ. However, β showed a strong dependence on the fraction of the data selected to fit the unloading curve, which means that β is especially susceptible to error in the calculation of the initial unloading slope.
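Each of the dimensionless functions above is a simple ratio of measured indentation quantities; a small sketch makes the definitions concrete (all numeric values below are invented placeholders, not results from the study):

```python
# Hedged sketch of the dimensionless indentation functions: load P,
# elastic modulus E, maximum depth h_max, contact depth h_c, hardness H,
# yield stress Y, unloading stiffness S, final depth h_f, and plastic
# and total indentation work W_p, W_t. All input numbers are invented.

def pi_functions(P, E, h_max, h_c, H, Y, S, h_f, W_p, W_t):
    return {
        "Pi1": P / (E * h_max ** 2),  # load normalised by E*h^2
        "Pi2": h_c / h_max,           # contact-depth ratio
        "Pi3": H / Y,                 # constraint factor
        "Pi4": S / (E * h_max),       # normalised unloading stiffness
        "Pi6": h_max / h_f,           # depth-recovery ratio
        "Pi7": W_p / W_t,             # plastic-to-total work ratio
    }

vals = pi_functions(P=50e-3, E=70e9, h_max=1e-6, h_c=0.8e-6,
                    H=1.0e9, Y=0.35e9, S=1.4e5, h_f=0.7e-6,
                    W_p=8e-9, W_t=1e-8)
print(round(vals["Pi2"], 3))
```

Because every function is a ratio of quantities with matching dimensions, the study's finding that they depend only on (Y/E, R/h_max, n, θ) is exactly what dimensional analysis predicts.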
Abstract:
This paper presents a rational approach to the design of a catamaran's hydrofoil applied within a modern context of multidisciplinary optimization. The approach used includes the use of response surfaces represented by neural networks and a distributed programming environment that increases the optimization speed. A rational approach to the problem simplifies the complex optimization model; when combined with the distributed dynamic training used for the response surfaces, this model increases the efficiency of the process. The results achieved using this approach have justified this publication.
Abstract:
With the advent and development of technology, mainly the Internet, more and more electronic services are being offered to customers in all areas of business, especially in the offering of information services, as in virtual libraries. This article proposes a new opportunity to provide services to virtual library customers, presenting a methodology for the implementation of electronic services oriented by these customers' life situations. Through analytical observation of several national virtual library sites, it was identified that offering services that consider life situations and relationship-interest situations can improve the service provided to customers, yielding greater satisfaction and, consequently, higher quality in the offering of information services. The visits to those sites and the critical analysis of the data collected during these visits, supported by the results of bibliographic research, enabled the description of this methodology. The conclusion is that providing services in an isolated way, or only according to the user's profile, on virtual library sites is not always enough to meet the needs and expectations of customers, which suggests offering these services with life situations and relationship-interest situations in mind, as a complement that adds value to the business of the virtual library. This is relevant because it indicates new opportunities to provide virtual library services with quality, serving as a guide for managers of information providers and enabling new means of access to information services for such customers, aiming at proactivity and service integration in order to solve real problems.
Abstract:
Background: The MASS IV-DM Trial is a large project from a single institution, the Heart Institute (InCor), University of Sao Paulo Medical School, Brazil, to study ventricular function and coronary arteries in patients with type 2 diabetes mellitus. Methods/Design: The study will enroll 600 patients with type 2 diabetes who have angiographically normal ventricular function and coronary arteries. The goal of the MASS IV-DM Trial is to achieve a long-term evaluation of the development of coronary atherosclerosis by using angiograms and coronary-artery calcium scans by electron-beam computed tomography at baseline and after 5 years of follow-up. In addition, the incidence of major cardiovascular events and the dysfunction of various organs involved in this disease, particularly microalbuminuria and renal function, will be analyzed through clinical evaluation. Furthermore, an effort will be made to investigate in depth the presence of major cardiovascular risk factors, especially the biochemical profile, metabolic syndrome, inflammatory activity, oxidative stress, endothelial function, prothrombotic factors, and profibrinolytic and platelet activity. Polymorphisms will be evaluated as determinants of disease and for their possible role in the genesis of micro- and macrovascular damage. Discussion: The MASS IV-DM trial is designed to include diabetic patients with clinically suspected myocardial ischemia in whom conventional angiography shows angiographically normal coronary arteries. The results of this extensive investigation, including angiographic follow-up by several methods, vascular reactivity, prothrombotic mechanisms, and genetic and biochemical studies, may facilitate the understanding of the so-called micro- and macrovascular disease of DM.
Abstract:
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of this data is critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to its coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.
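The conclusion's key idea, that genomic coordinates act as a common join key for heterogeneous data, can be shown with a tiny interval-overlap join (the track and feature names below are invented, not from the Gaggle Genome Browser):

```python
# Hedged sketch of joining heterogeneous genome tracks by coordinate:
# two unrelated feature lists (names invented) are related purely by
# overlap of their half-open [start, end) genomic intervals.

def overlaps(a, b):
    return a[0] < b[1] and b[0] < a[1]

genes = [("geneA", 100, 500), ("geneB", 800, 1200)]   # annotation track
peaks = [("peak1", 450, 520), ("peak2", 700, 790)]    # e.g. binding peaks

joined = [(g[0], p[0]) for g in genes for p in peaks
          if overlaps((g[1], g[2]), (p[1], p[2]))]
print(joined)  # → [('geneA', 'peak1')]
```

A real browser would use an indexed structure (interval tree or SQL range query) rather than this quadratic scan, but the join semantics are the same.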
Abstract:
We have modeled, fabricated, and characterized superhydrophobic surfaces with a morphology formed of periodic microstructures which are cavities. This surface morphology is the inverse of that generally reported in the literature when the surface is formed of pillars or protrusions, and has the advantage that when immersed in water the confined air inside the cavities tends to expel the invading water. This differs from the case of a surface morphology formed of pillars or protrusions, for which water can penetrate irreversibly among the microstructures, necessitating complete drying of the surface in order to again recover its superhydrophobic character. We have developed a theoretical model that allows calculation of the microcavity dimensions needed to obtain superhydrophobic surfaces composed of patterns of such microcavities, and that provides estimates of the advancing and receding contact angle as a function of microcavity parameters. The model predicts that the cavity aspect ratio (depth-to-diameter ratio) can be much less than unity, indicating that the microcavities do not need to be deep in order to obtain a surface with enhanced superhydrophobic character. Specific microcavity patterns have been fabricated in polydimethylsiloxane and characterized by scanning electron microscopy, atomic force microscopy, and contact angle measurements. The measured advancing and receding contact angles are in good agreement with the predictions of the model. (C) 2010 American Institute of Physics. [doi:10.1063/1.3466979]
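The paper's own model is not reproduced in the abstract, but the general mechanism, a composite solid/air interface raising the apparent contact angle, can be illustrated with the classic Cassie-Baxter relation; this is a generic textbook sketch under assumed geometry, not the authors' model.

```python
import math

# Hedged illustration (not the paper's model): the Cassie-Baxter relation
#   cos(theta*) = f*cos(theta) - (1 - f)
# for a composite solid/air interface. For a square array of circular
# microcavities of diameter d and pitch p, the wetted solid fraction is
# assumed to be f = 1 - pi*d^2/(4*p^2); trapped air raises the angle.

def cassie_angle(theta_deg, d, p):
    f = 1.0 - math.pi * d ** 2 / (4.0 * p ** 2)   # solid fraction
    cos_star = f * math.cos(math.radians(theta_deg)) - (1.0 - f)
    return math.degrees(math.acos(cos_star))

# invented example: flat-surface angle 110 deg, 8 um cavities, 10 um pitch
print(round(cassie_angle(110.0, 8e-6, 10e-6), 1))
```

Note that nothing in this relation involves the cavity depth, which is consistent with the abstract's finding that the depth-to-diameter aspect ratio can be much less than unity.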
Abstract:
We have developed a theoretical model for superhydrophobic surfaces that are formed from an extended array of microcavities, and have fabricated specific microcavity patterns to form superhydrophobic surfaces of the kind modeled. The model shows that the cavity aspect ratio can be significantly less than unity, indicating that the microcavities do not need to be deep in order to enhance the superhydrophobic character of the surface. We have fabricated surfaces of this kind and measured advancing contact angle as high as 153 degrees, in agreement with predictions of the model.