910 results for Software Design Pattern
Abstract:
OBJECTIVES: Proteomics approaches to cardiovascular biology and disease hold the promise of identifying specific proteins and peptides, or modifications thereof, to assist in the identification of novel biomarkers. METHODS: Using surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF-MS), serum peptide and protein patterns were detected that discriminate between postmenopausal women with and without hormone replacement therapy (HRT). RESULTS: Serum from 13 HRT and 27 control subjects was analyzed, and 42 peptides and proteins could be tentatively identified based on their molecular weight and binding characteristics on the chip surface. Using decision-tree-based Biomarker Patterns™ Software classification and regression analysis, a discriminatory function was developed that correctly distinguished HRT women from controls, yielding a sensitivity of 100% and a specificity of 100%. The results show that peptide and protein patterns have the potential to deliver novel biomarkers as well as to pinpoint targets for improved treatment. The biomarkers obtained represent a promising tool to discriminate between HRT users and non-users. CONCLUSION: According to a tentative identification of the markers by their molecular weight and binding characteristics, most of them appear to be part of the inflammation-induced acute-phase response.
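The 100% sensitivity and specificity reported above are standard confusion-matrix quantities. A minimal sketch of how they are computed for a binary classifier's predictions (labels here are purely illustrative, not the study's data):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labels: 1 = HRT user, 0 = control.
y_true = [1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0]  # a perfect discriminant, as the study reports
sens, spec = sensitivity_specificity(y_true, y_pred)
```

With identical predicted and true labels, both quantities evaluate to 1.0, i.e. 100%.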
Abstract:
An experiment was conducted to determine the effect of grazing versus zero-grazing on energy expenditure (EE), feeding behaviour and physical activity in dairy cows at different stages of lactation. Fourteen Holstein cows were subjected to two treatments in a repeated crossover design with three experimental series (S1, S2, and S3) reflecting increased days in milk (DIM). At the beginning of each series, cows were on average at 38, 94 and 171 (standard deviation (SD) 10.8) DIM, respectively. Each series consisted of two periods containing a 7-d adaptation and a 7-d collection period each. Cows either grazed on pasture for 16–18.5 h per day or were kept in a freestall barn and had ad libitum access to herbage harvested from the same paddock. Herbage intake was estimated using the double alkane technique. On each day of the collection period, EE of one cow in the barn and of one cow on pasture was determined for 6 h by using the 13C bicarbonate dilution technique, with blood sample collection done either manually in the barn or using an automatic sampling system on pasture. Furthermore, during each collection period physical activity and feeding behaviour of cows were recorded over 3 d using pedometers and behaviour recorders. Milk yield decreased with increasing DIM (P<0.001) but was similar with both treatments. Herbage intake was lower (P<0.01) for grazing cows (16.8 kg dry matter (DM)/d) compared to zero-grazing cows (18.9 kg DM/d). The lowest (P<0.001) intake was observed in S1 and similar intakes were observed in S2 and S3. Within the 6-h measurement period, grazing cows expended 19% more (P<0.001) energy (319 versus 269 kJ/kg metabolic body size (BW0.75)) than zero-grazing cows and differences in EE did not change with increasing DIM. Grazing cows spent proportionally more (P<0.001) time walking and less time standing (P<0.001) and lying (P<0.05) than zero-grazing cows. 
The proportion of time spent eating was greater (P<0.001) and that of time spent ruminating was lower (P<0.05) for grazing cows compared to zero-grazing cows. In conclusion, lower feed intake along with the unchanged milk production indicates that grazing cows mobilized body reserves to cover additional energy requirements, which were at least partly caused by greater physical activity. However, changes in the cows' behaviour between the considered time points during lactation were too small to alter this relationship, so that differences in EE remained similar between treatments with increasing DIM.
Abstract:
Software architecture is the result of a design effort aimed at ensuring a certain set of quality attributes. As we show, quality requirements are commonly specified in practice but are rarely validated using automated techniques. In this paper we analyze and classify commonly specified quality requirements after interviewing professionals and running a survey. We report on tools used to validate those requirements and comment on the obstacles practitioners encounter when performing such activities (e.g., insufficient tool support; poor understanding of users' needs). Finally, we discuss opportunities for increasing the adoption of automated tools based on the information we collected during our study (e.g., using a business-readable notation for expressing quality requirements; increasing awareness by monitoring non-functional aspects of a system).
Abstract:
Plant‐mediated interactions between herbivores are important determinants of community structure and plant performance in natural and agricultural systems. Current research suggests that the outcome of the interactions is determined by herbivore and plant identity, which may result in stochastic patterns that impede adaptive evolution and agricultural exploitation. However, few studies have systematically investigated specificity versus general patterns in a given plant system by varying the identity of all involved players. We investigated the influence of herbivore identity and plant genotype on the interaction between leaf‐chewing and root‐feeding herbivores in maize using a partial factorial design. We assessed the influence of leaf induction by oral secretions of six different chewing herbivores on the response of nine different maize genotypes and three different root feeders. Contrary to our expectations, we found a highly conserved pattern across all three dimensions of specificity: The majority of leaf herbivores elicited a negative behavioral response from the different root feeders in the large majority of tested plant genotypes. No facilitation was observed in any of the treatment combinations. However, the oral secretions of one leaf feeder and the responses of two maize genotypes did not elicit a response from a root‐feeding herbivore. Together, these results suggest that plant‐mediated interactions in the investigated system follow a general pattern, but that a degree of specificity is nevertheless present. Our study shows that within a given plant species, plant‐mediated interactions between herbivores of the same feeding guild can be stable. This stability opens up the possibility of adaptations by associated organisms and suggests that plant‐mediated interactions may contribute more strongly to evolutionary dynamics in terrestrial (agro)ecosystems than previously assumed.
Abstract:
The usage of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research has investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2mm accuracy for single-beam fluence map verifications and to 5%/2mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and percent of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. 
Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
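The gamma index mentioned above combines a dose tolerance (e.g. 2%) and a distance-to-agreement (e.g. 2 mm) into a single per-point pass/fail metric. A minimal 1-D sketch of the standard gamma computation (the thesis's NAT index is a different, related metric; the dose profiles here are illustrative):

```python
import math

def gamma_1d(ref, meas, spacing_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """1-D gamma index: for each reference point, the minimum combined
    dose/distance deviation over all measured points. Doses are assumed
    normalized, so dose_tol is a fraction (e.g. 0.02 for 2%)."""
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_meas in enumerate(meas):
            dd = (d_meas - d_ref) / dose_tol            # dose deviation term
            dr = (j - i) * spacing_mm / dist_tol_mm     # distance term
            best = min(best, math.hypot(dd, dr))
        gammas.append(best)
    return gammas

ref = [0.10, 0.50, 1.00, 0.50, 0.10]       # calculated dose profile
measured = [0.10, 0.51, 1.00, 0.49, 0.10]  # measured dose profile
g = gamma_1d(ref, measured, spacing_mm=1.0)
pass_rate = sum(1 for x in g if x <= 1.0) / len(g)  # gamma <= 1 passes
```

A point passes when its gamma value is at most 1; the percent of pixels failing gamma, as used in the abstract, is simply one minus this pass rate.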
Abstract:
Conventional designs of animal bioassays allocate the same number of animals to control and dose groups to explore the spontaneous and induced tumor incidence rates, respectively. The purposes of such bioassays are (a) to determine whether or not the substance exhibits carcinogenic properties, and (b) if so, to estimate the human response at relatively low doses. In this study, it was found that the optimal allocation to the experimental groups which, in some sense, minimizes the error of the estimated response for low-dose extrapolation is associated with the dose level and tumor risk. The number of dose levels was investigated at an affordable experimental cost. The administered dose pattern of 1 MTD, 1/2 MTD, 1/4 MTD, etc., plus a control gives the most reasonable arrangement for low-dose extrapolation. An arrangement of five dose groups may make the highest dose trivial. A four-dose design circumvents this problem and also retains one degree of freedom for testing the goodness-of-fit of the response model. An example using the data on liver tumors induced in mice in a lifetime study of feeding dieldrin (Walker et al., 1973) is implemented with the methodology. The results are compared with conclusions drawn from other studies.
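The halving dose pattern described above can be sketched directly; the MTD value and group count here are illustrative placeholders, not the study's parameters:

```python
def dose_levels(mtd, n_doses):
    """Control (0) followed by MTD / 2**k for k = 0 .. n_doses-1,
    i.e. the pattern 1 MTD, 1/2 MTD, 1/4 MTD, ... plus control."""
    return [0.0] + [mtd / 2**k for k in range(n_doses)]

# A four-dose design with a normalized MTD of 1.0:
levels = dose_levels(mtd=1.0, n_doses=4)
```

For `n_doses=4` this yields control plus 1, 1/2, 1/4, and 1/8 of the MTD, matching the four-dose arrangement the abstract recommends.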
Abstract:
The purpose of this study was to design, synthesize, and develop novel transporter-targeting agents for image-guided therapy and drug delivery. Two novel agents, N4-guanine (N4amG) and glycopeptide (GP), were synthesized for tumor cell proliferation assessment and as a cancer theranostic platform, respectively. N4amG and GP were synthesized and radiolabeled with 99mTc and 68Ga. The chemical and radiochemical purities as well as the radiochemical stabilities of radiolabeled N4amG and GP were tested. In vitro stability assessment showed both 99mTc-N4amG and 99mTc-GP were stable up to 6 hours, whereas 68Ga-GP was stable up to 2 hours. Cell culture studies confirmed radiolabeled N4amG and GP could penetrate the cell membrane through nucleoside transporters and amino acid transporters, respectively. Up to 40% of intracellular 99mTc-N4amG and 99mTc-GP was found within the cell nucleus following 2 hours of incubation. Flow cytometry analysis revealed 99mTc-N4amG was a cell cycle S-phase-specific agent. There was a significant difference in the uptake of 99mTc-GP between pre- and post-paclitaxel-treated cells, which suggests that 99mTc-GP may be useful in chemotherapy treatment monitoring. Moreover, radiolabeled N4amG and GP were tested in vivo using tumor-bearing animal models. 99mTc-N4amG showed an increase in tumor-to-muscle count density ratios of up to 5 at 4-hour imaging. Both 99mTc-labeled agents showed decreased tumor uptake after paclitaxel treatment. Immunohistochemistry analysis demonstrated that the uptake of 99mTc-N4amG was correlated with Ki-67 expression. Both 99mTc-N4amG and 99mTc-GP could differentiate between tumor and inflammation in animal studies. Furthermore, 68Ga-GP was compared to 18F-FDG in rabbit PET imaging studies. 68Ga-GP had lower tumor standardized uptake values (SUV), but similar uptake dynamics and a different biodistribution compared with 18F-FDG.
Finally, to demonstrate that GP can be a potential drug carrier for cancer theranostics, several drugs, including doxorubicin, were selected to be conjugated to GP. Imaging studies demonstrated that tumor uptake of GP-drug conjugates increased as a function of time. GP-doxorubicin (GP-DOX) showed a slow-release pattern in an in vitro cytotoxicity assay and exhibited anti-cancer efficacy with reduced toxicity in an in vivo tumor growth delay study. In conclusion, both N4amG and GP are transporter-based targeting agents. Radiolabeled N4amG can be used for tumor cell proliferation assessment. GP is a potential agent for image-guided therapy and drug delivery.
Abstract:
This work addresses the construction of open source institutional repositories with the Greenstone software. It follows two paths, one theoretical and one model-based, the latter developing a practical application. The first path, which constitutes the theoretical framework, comprises a description of the open access and open source philosophies for the creation of institutional repositories. It also covers, in general terms, topics related to the OAI protocol, the legal framework concerning intellectual property, licenses, and an introduction to metadata. The same path addresses theoretical aspects of institutional repositories: definitions, benefits, types, the components involved, open source tools for creating repositories, a description of those tools and, finally, an extended description of the Greenstone software, chosen for the model development of the institutional repository presented in a digital demonstrator. The second path, corresponding to the model development, includes on the one hand the repository model itself built with Greenstone, detailing one by one the components that make it up. This is the theoretical-practical input for the step-by-step design of the institutional repository. On the other hand, it includes the result of the modeling, that is, the repository created, which is exported as a web environment to a digital medium for its visibility. The step-by-step design of the repository constitutes the core contribution of this thesis.
Abstract:
This thesis contributes to the analysis and design of printed reflectarray antennas. The main part of the work focuses on the analysis of dual offset antennas comprising two reflectarray surfaces, one acting as sub-reflector and the other as main reflector. These configurations introduce additional complexity in several respects compared to conventional dual offset reflectors; however, they offer many degrees of freedom that can be used to improve the electrical performance of the antenna. The thesis is organized in four parts: the development of an analysis technique for dual-reflectarray antennas; a preliminary validation of that methodology using equivalent reflector systems as reference antennas; a more rigorous validation of the software tool by manufacturing and testing a dual-reflectarray antenna demonstrator; and the practical design of dual-reflectarray systems for applications that show the potential of this kind of configuration to scan the beam and to generate contoured beams. In the first part, a general tool has been implemented to analyze high-gain antennas constructed from two flat reflectarray structures. The classic reflectarray analysis, based on MoM under the local periodicity assumption, is used for both sub- and main reflectarrays, taking into account the incidence angle on each reflectarray element. The incident field on the main reflectarray is computed from the field radiated by all the elements on the sub-reflectarray. Two approaches have been developed: one employs a simple approximation to reduce the computer run time, while the other does not, but in many cases offers improved accuracy. The approximation consists of computing the reflected field on each main-reflectarray element only once for all the fields radiated by the sub-reflectarray elements, assuming the response is the same because the only difference is a small variation in the angle of incidence.
This approximation is very accurate when the reflectarray elements on the main reflectarray show relatively small sensitivity to the angle of incidence. An extension of the analysis technique has been implemented to study dual-reflectarray antennas comprising a main reflectarray printed on a parabolic, or more generally curved, surface. In many applications of dual-reflectarray configurations, the reflectarray elements are in the near field of the feed horn. To account for the near field radiated by the horn, the incident field on each reflectarray element is computed using a spherical mode expansion. In this region, the angles of incidence are moderately wide, and they are considered in the analysis of the reflectarray to better calculate the actual incident field on the sub-reflectarray elements. This technique improves the accuracy of the predicted co- and cross-polar patterns and antenna gain with respect to using ideal feed models. In the second part, as a preliminary validation, the proposed analysis method has been used to design a dual-reflectarray antenna that emulates previous dual-reflector antennas in Ku- and W-bands including a reflectarray as sub-reflector. The results for the dual-reflectarray antenna compare very well with those of the parabolic reflector and reflectarray sub-reflector: radiation patterns, antenna gain, and efficiency are practically the same when the main parabolic reflector is replaced by a flat reflectarray. The results show that the gain is reduced by only a few tenths of a dB as a result of the ohmic losses in the reflectarray. The phase adjustment on two surfaces provided by the dual-reflectarray configuration can be used to improve antenna performance in applications requiring multiple beams, beam scanning, or shaped beams. Third, a very challenging dual-reflectarray antenna demonstrator has been designed, manufactured, and tested for a more rigorous validation of the analysis technique presented.
The proposed antenna configuration has the feed, the sub-reflectarray, and the main reflectarray in the near field of one another, so that conventional far-field approximations are not suitable for the analysis of such an antenna. This geometry is used as a benchmark for the proposed analysis tool under very stringent conditions. Some aspects of the analysis technique that improve its accuracy are also discussed. These improvements include a novel method to reduce the inherent cross-polarization introduced mainly by grounded patch arrays. It has been verified that cross-polarization in offset reflectarrays can be significantly reduced by properly adjusting the patch dimensions in the reflectarray so as to produce an overall cancellation of the cross-polarization. The dimensions of the patches are adjusted not only to provide the required phase distribution to shape the beam, but also to exploit the zero crossings of the cross-polarization components. The last part of the thesis deals with direct applications of the technique described. The technique is directly applicable to the design of contoured beam antennas for DBS applications, where the cross-polarization requirements are very stringent. The beam shaping is achieved by synthesizing the phase distribution on the main reflectarray while the sub-reflectarray emulates an equivalent hyperbolic sub-reflector. Dual-reflectarray antennas also offer the ability to scan the beam over small angles about boresight. Two possible architectures for a Ku-band antenna are described, based on a dual planar reflectarray configuration that provides electronic beam scanning in a limited angular range. In the first architecture, beam scanning is achieved by introducing a phase control in the elements of the sub-reflectarray while the main reflectarray is passive.
A second alternative is also studied, in which the beam scanning is produced using 1-bit control on the main reflectarray, while a passive sub-reflectarray is designed to provide a large focal distance within a compact configuration. The system aims to provide a solution for bi-directional satellite links for emergency communications. In both proposed architectures, the objective is to provide compact optics that are simple to fold and deploy.
Abstract:
The six-port network is an interesting radiofrequency architecture with multiple possibilities. Since it was first introduced in the seventies as an alternative network analyzer, the six-port network has been used for many applications, such as homodyne receivers, radar systems, direction-of-arrival estimation, UWB (Ultra-Wide-Band), or MIMO (Multiple Input Multiple Output) systems. Currently, it is considered one of the best candidates to implement a Software Defined Radio (SDR). This thesis comprises an exhaustive study of this promising architecture, including its fundamentals and the state of the art. In addition, the design and development of an SDR 0.3-6 GHz six-port receiver prototype, implemented in conventional technology, is presented. The system is experimentally characterized and validated for RF signal demodulation with good performance. The analysis of the six-port architecture is complemented by a theoretical and experimental comparison with other radiofrequency architectures suitable for SDR. Some novel contributions are introduced in the present thesis, addressing highly topical issues in the six-port technique: the development and optimization of real-time I-Q regeneration techniques for multiport networks, and the search for new techniques and technologies that contribute to the miniaturization of the six-port architecture. In particular, the novel contributions of this thesis can be summarized as: - Introduction of a new real-time auto-calibration method for multiport receivers, particularly suitable for broadband designs and high data rate applications. - Introduction of a new direct baseband I-Q regeneration technique for five-port receivers. - Contribution to the miniaturization of six-port receivers through the use of multilayer LTCC (Low Temperature Cofired Ceramic) technology, with the implementation of a compact (30x30x1.25 mm) broadband (0.3-6 GHz) six-port receiver in LTCC technology.
The results and conclusions derived from this thesis have been satisfactory and quite fruitful in terms of publications. A total of fourteen works have been published in international journals, international conferences, and national conferences. Additionally, a paper has been submitted to an internationally recognized journal and is currently under review.
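The I-Q regeneration problem at the heart of a six-port receiver can be illustrated with a textbook idealization: power detectors observe the RF signal combined with the local oscillator at relative phase shifts of 0°, 90°, 180°, and 270°, and differences of opposite-phase port powers recover I and Q. This sketch is the idealized model only, not the calibration method developed in the thesis; all signal values are illustrative:

```python
import math

def port_power(A, I, Q, phi):
    """Detected power |A + (I + jQ) * e^{j*phi}|^2 at one ideal port,
    where A is the LO amplitude and (I, Q) the baseband signal."""
    re = A + I * math.cos(phi) - Q * math.sin(phi)
    im = I * math.sin(phi) + Q * math.cos(phi)
    return re * re + im * im

A, I, Q = 1.0, 0.3, -0.2  # hypothetical LO amplitude and transmitted symbol
# Four ports at relative phases 0, 90, 180, 270 degrees:
P = [port_power(A, I, Q, k * math.pi / 2) for k in range(4)]

# Differencing opposite ports cancels the |A|^2 + |I+jQ|^2 terms:
I_hat = (P[0] - P[2]) / (4 * A)
Q_hat = (P[3] - P[1]) / (4 * A)
```

In a real receiver the port phases and detector responses deviate from this ideal, which is precisely why the calibration and direct-regeneration techniques the thesis contributes are needed.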
Abstract:
Embedded systems are commonly designed by specifying and developing hardware and software separately. By contrast, hardware/software (HW/SW) co-development exploits the trade-offs between hardware and software in a system through their concurrent design. HW/SW co-development techniques take advantage of the flexibility of system design to create architectures that can meet stringent performance requirements with a shorter design cycle. This paper presents the work done within the scope of the ESA HWSWCO (Hardware-Software Co-design) study. The main objective of this study has been to address the HW/SW co-design phase so as to integrate this engineering task into the ASSERT process (refer to [1]), compatible with the existing ASSERT approach, process, and tools. Advances in the automation of HW and SW design and the adoption of the Model Driven Architecture (MDA) [9] paradigm make it possible to define a proper integration substrate and enable the continuous interaction of the HW and SW design paths.
Abstract:
Manufacturing technologies such as injection molding or embossing impose production limits, for instance on the minimum radii of vertices or on the draft angle required for demolding. In optical design, these restrictions may limit the optical efficiency of the system or generate undesired artifacts in the illumination pattern. A novel manufacturing concept is presented here, in which the optical surfaces are not obtained from the usual revolution symmetry about a central axis (the z axis), but are instead calculated as free-form surfaces describing a spiral trajectory around the z axis. The main advantage of this new concept lies in the manufacturing process: a molded piece can be easily separated from its mold simply by applying a combined rotational movement around the z axis and linear movement along the z axis, even for negative draft angles. The general design procedure is described in detail.
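The spiral-trajectory idea above can be sketched parametrically: as the sweep angle advances around the z axis, the surface also advances in z by a fixed pitch per turn, which is what allows the screw-like demolding motion. The profile function and pitch below are illustrative placeholders, not the paper's actual surfaces:

```python
import math

def spiral_point(theta, pitch, profile):
    """Return (x, y, z) on a spiral surface at sweep angle theta (radians).
    profile(theta) gives the local radius; z advances one pitch per turn,
    so the part releases by combined rotation about z and translation along z."""
    r = profile(theta)
    z = pitch * theta / (2 * math.pi)
    return (r * math.cos(theta), r * math.sin(theta), z)

profile = lambda theta: 10.0  # constant 10 mm radius, for simplicity
# After one full turn the point returns to the same (x, y) but is lifted by one pitch:
x, y, z = spiral_point(theta=2 * math.pi, pitch=2.0, profile=profile)
```

A free-form design would replace the constant `profile` with the computed optical surface; the demolding property depends only on the spiral sweep, not on the profile itself.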
Abstract:
The European Higher Education Area (EHEA) has led to a change in the way subjects are taught. One of the most important aspects of the EHEA is to support the autonomous study of students. Following this new approach, the virtual laboratory of the subject Mechanisms of the Aeronautical studies at the Technical University of Madrid is being migrated to an on-line scheme. This virtual laboratory consists of two practices: the design of cam-follower mechanisms and the design of gear trains. Both practices are software applications that, in the current situation, need to be installed on each computer, and the students carry out the practice in the computer classroom of the school under the supervision of a teacher. During this year, the cam-follower mechanism design practice has been moved to a web application using Java and the Google Development Toolkit. In this practice the students have to design and study the running of a cam that performs a specific displacement diagram with a selected follower, taking into account that the mechanism must be able to work properly at a high-speed regime. The practice has kept its objectives on the new platform, while taking advantage of the new methodology and avoiding the inconveniences the previous version had shown. Once the new practice was ready, a pilot study was carried out to compare the two approaches: on-line and in-lab. This paper shows the adaptation of the cam-and-follower practice to an on-line methodology. Both practices are described, and the changes made to the initial one are shown. They are compared, and the weak and strong points of each are analyzed. Finally, we explain the pilot study carried out, the students' impressions, and the results obtained.
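A displacement diagram of the kind the students design can be illustrated with a standard cam motion law. Cycloidal motion is a common textbook choice for high-speed operation because follower velocity and acceleration vanish at both ends of the rise; this sketch is illustrative and not tied to the practice's actual software:

```python
import math

def cycloidal_rise(theta, h, beta):
    """Follower displacement for cam angle theta in [0, beta] (radians),
    with total lift h over rise angle beta:
    s = h * (theta/beta - sin(2*pi*theta/beta) / (2*pi))."""
    x = theta / beta
    return h * (x - math.sin(2 * math.pi * x) / (2 * math.pi))

h, beta = 20.0, math.pi / 2  # hypothetical: 20 mm lift over a 90-degree rise
s0 = cycloidal_rise(0.0, h, beta)           # displacement at start of rise
s_half = cycloidal_rise(beta / 2, h, beta)  # displacement at mid-rise
s_end = cycloidal_rise(beta, h, beta)       # displacement at end of rise
```

By symmetry of the cycloidal law, the follower is at exactly half the lift at mid-rise, and at zero and full lift at the two ends.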