992 results for dual value


Relevance:

60.00%

Publisher:

Abstract:

A novel fiber Bragg grating (FBG) sensor system based on an interrogation technique using two parallel matched gratings was designed and theoretically discussed. With the interrogation grating simultaneously playing the role of temperature-compensation grating, the wavelength drifts induced by temperature and strain were discriminated. Additionally, the expressions for temperature and strain were deduced for our solution, and the dual-value problem and cross-sensitivity were resolved simultaneously through data processing. The influence of the FBG parameters on the dynamic range and precision was discussed. Moreover, changes in environment temperature do not affect the dynamic range of the sensor system, thanks to temperature tuning. The system proposed in this paper will be of great significance in accelerating real engineering applications of FBG sensing techniques. (c) 2007 Elsevier GmbH. All rights reserved.
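The discrimination of temperature and strain amounts to inverting a small sensitivity matrix once the two wavelength shifts are measured. A minimal sketch is given below, assuming illustrative sensitivity coefficients and wavelength shifts, since the abstract reports no numerical values; the expressions actually deduced in the paper may differ.

```python
import numpy as np

# Hypothetical sensitivity coefficients (pm per degC, pm per microstrain);
# the actual values depend on the gratings used and are not given in the abstract.
K_T1, K_e1 = 10.0, 1.2   # sensing FBG: temperature and strain sensitivities
K_T2, K_e2 = 10.0, 0.0   # interrogation/compensation FBG: temperature only

# Measured Bragg-wavelength shifts (pm) for the two gratings (illustrative).
d_lambda = np.array([95.0, 60.0])

# Solve [[K_T1, K_e1], [K_T2, K_e2]] @ [dT, d_eps] = d_lambda
K = np.array([[K_T1, K_e1],
              [K_T2, K_e2]])
dT, d_eps = np.linalg.solve(K, d_lambda)
print(f"Temperature change: {dT:.2f} degC, strain change: {d_eps:.1f} microstrain")
```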

Relevance:

60.00%

Publisher:

Abstract:

The "sustainability" concept relates to the prolonging of human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship and practices that conserve resources in a manner that allow growth and development to be sustained for the long-term without degrading the environment are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe as well as in the U.S. are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post construction properties of traditional HMA. Lowering the production temperature reduce the fuel usage and the production of emissions therefore and that improve conditions for workers and supports the sustainable development. Even the crumb-rubber modifier (CRM), with shredded automobile tires and used in the United States since the mid 1980s, has proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is not only relevant in an environmental aspect but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project is aimed to demonstrate the dual value of these Asphalt Mixes in regards to the environmental and mechanical performance and to suggest a low environmental impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design but it cannot be the only step. The eco-compatible approach should be extended also to the design method and material characterization because only with these phases is it possible to exploit the maximum potential properties of the used materials. Appropriate asphalt concrete characterization is essential and vital for realistic performance prediction of asphalt concrete pavements. Volumetric (Mix design) and mechanical (Permanent deformation and Fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to correctly use the material. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under the different traffic and environmental conditions, was the application of choice. In particular this study focus on the CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to the surface cracking and to the rutting respectively. It works in increments of time and, using the output from one increment, recursively, as input to the next increment, predicts the pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation with defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as surface layer of 60 mm thickness. 
The performance of the pavement was compared to the performance of the same pavement structure where different kinds of asphalt concrete were used as the surface layer. In comparison to a conventional asphalt concrete, three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed. The first two chapters summarize the steps necessary to satisfy the sustainable pavement design procedure. In Chapter I the problem of eco-compatible asphalt pavement design is introduced. The low-environmental-impact materials, Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail. In addition, the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate materials selection and performance evaluation. In Chapter III, CalME is introduced through an explanation of the different design approaches it provides, focusing on the I-R procedure. In Chapter IV, the experimental program is presented with an explanation of the laboratory test devices adopted. The fatigue and rutting performance of the study mixes is presented in Chapters V and VI, respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the asphalt pavement structure simulations with different surface layers are reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rut depth in each bound layer were analyzed.
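The I-R procedure outlined above is essentially a time-marching loop in which each increment's damage output becomes the next increment's stiffness input. The sketch below is only a schematic illustration of that recursion; the damage and stiffness-reduction functions are invented placeholders, not CalME's fatigue or permanent shear strain models.

```python
# Schematic Incremental-Recursive loop (not CalME's actual models):
# each increment uses the previous increment's layer modulus to accumulate
# damage, then updates the modulus for the next increment.

def damage_increment(modulus_mpa, axle_loads, dt_hours):
    """Hypothetical damage accrued in one increment (placeholder model)."""
    return 1e-9 * axle_loads * dt_hours * (3000.0 / modulus_mpa)

def degraded_modulus(initial_modulus_mpa, damage):
    """Hypothetical stiffness reduction as damage accumulates."""
    return initial_modulus_mpa * max(0.2, 1.0 - damage)

E0, E = 3000.0, 3000.0      # initial and current surface-layer modulus (MPa)
damage = 0.0
for increment in range(24 * 365 * 20):   # hourly increments over 20 years
    damage += damage_increment(E, axle_loads=50, dt_hours=1.0)
    E = degraded_modulus(E0, damage)     # recursive input to the next increment

print(f"Final damage: {damage:.3f}, final modulus: {E:.0f} MPa")
```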

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE The objective of this study was to assess the discriminative power of dual-energy computed tomography (DECT) versus single-energy CT (SECT) in distinguishing between ferromagnetic and non-ferromagnetic ballistic projectiles, to improve safety regarding magnetic resonance (MR) imaging studies in patients with retained projectiles. MATERIALS AND METHODS Twenty-seven ballistic projectiles, including 25 bullets (diameter, 3-15 mm) and 2 shotgun pellets (2 mm each), were examined in an anthropomorphic chest phantom using 128-section dual-source CT. Data acquisition was performed with tube voltages set at 80, 100, 120, and 140 kV(p). Two readers independently assessed CT numbers of the projectile's core on images reconstructed with an extended CT scale. Dual-energy indices (DEIs) were calculated from both the 80-/140-kV(p) and 100-/140-kV(p) pairs; receiver operating characteristic curves were fitted to assess ferromagnetic properties by means of CT numbers and DEI. RESULTS Nine (33%) of the projectiles were ferromagnetic; 18 (67%) were non-ferromagnetic. Interreader and intrareader correlations of CT number measurements were excellent (intraclass correlation coefficients, >0.906; P<0.001). The DEIs calculated from both 80/140 and 100/140 kV(p) were significantly (P<0.05) different between the ferromagnetic and non-ferromagnetic projectiles. The area under the curve (AUC) for differentiating between ferromagnetic and non-ferromagnetic ballistic projectiles was 0.75 and 0.80 for the tube voltage pairs of 80/140 and 100/140 kV(p) (P<0.05; 95% confidence interval, 0.57-0.94 and 0.62-0.97, respectively), increasing to 0.83 and 0.85 when shotgun pellets were excluded from the analysis. The AUC for SECT was 0.69 and 0.73 (80 and 100 kV[p], respectively). CONCLUSIONS DECT measurements combined with an extended CT scale allow non-ferromagnetic projectiles to be discriminated from ferromagnetic ones in an anthropomorphic chest phantom with a higher AUC than SECT. This study indicates that DECT may have the potential to contribute to MR safety and allow MR imaging of patients with retained projectiles. However, further studies are necessary before this concept may be used to triage clinical patients before MR.
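A dual-energy index is commonly computed from the CT numbers of the same material at the low and high tube voltages as DEI = (HU_low - HU_high) / (HU_low + HU_high + 2000). The sketch below evaluates that commonly cited definition with made-up CT numbers; the study's measured values are not reported in the abstract.

```python
def dual_energy_index(hu_low, hu_high):
    """Commonly cited DEI definition; the 2000 offset shifts HU so the denominator stays positive."""
    return (hu_low - hu_high) / (hu_low + hu_high + 2000.0)

# Illustrative (made-up) core CT numbers on an extended CT scale for two projectiles.
dei_a = dual_energy_index(hu_low=14000, hu_high=11000)   # hypothetical projectile A
dei_b = dual_energy_index(hu_low=30000, hu_high=28000)   # hypothetical projectile B
print(f"DEI projectile A: {dei_a:.3f}, DEI projectile B: {dei_b:.3f}")
```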

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND Multiple scores have been proposed to stratify bleeding risk, but their value to guide dual antiplatelet therapy duration has never been appraised. We compared the performance of the CRUSADE (Can Rapid Risk Stratification of Unstable Angina Patients Suppress Adverse Outcomes With Early Implementation of the ACC/AHA Guidelines), ACUITY (Acute Catheterization and Urgent Intervention Triage Strategy), and HAS-BLED (Hypertension, Abnormal Renal/Liver Function, Stroke, Bleeding History or Predisposition, Labile INR, Elderly, Drugs/Alcohol Concomitantly) scores in 1946 patients recruited in the Prolonging Dual Antiplatelet Treatment After Grading Stent-Induced Intimal Hyperplasia Study (PRODIGY) and assessed hemorrhagic and ischemic events in the 24- and 6-month dual antiplatelet therapy groups. METHODS AND RESULTS Bleeding score performance was assessed with a Cox regression model and C statistics. Discriminative and reclassification power was assessed with net reclassification improvement and integrated discrimination improvement. The C statistic was similar between the CRUSADE score (area under the curve 0.71) and ACUITY (area under the curve 0.68), and higher than HAS-BLED (area under the curve 0.63). CRUSADE, but not ACUITY, improved reclassification (net reclassification index 0.39, P=0.005) and discrimination (integrated discrimination improvement index 0.0083, P=0.021) of major bleeding compared with HAS-BLED. Major bleeding and transfusions were higher in the 24- versus 6-month dual antiplatelet therapy groups in patients with a CRUSADE score >40 (hazard ratio for bleeding 2.69, P=0.035; hazard ratio for transfusions 4.65, P=0.009) but not in those with a CRUSADE score ≤40 (hazard ratio for bleeding 1.50, P=0.25; hazard ratio for transfusions 1.37, P=0.44), with positive interaction (Pint=0.05 and Pint=0.01, respectively). The numbers of patients with high CRUSADE scores needed to treat for harm for major bleeding and transfusion were 17 and 15, respectively, with 24-month rather than 6-month dual antiplatelet therapy; the corresponding figures in the overall population were 67 and 71, respectively. CONCLUSIONS Our analysis suggests that the CRUSADE score predicts major bleeding similarly to ACUITY and better than HAS-BLED in an all-comer population undergoing percutaneous coronary intervention, and potentially identifies patients at higher risk of hemorrhagic complications when treated with a long-term dual antiplatelet therapy regimen. CLINICAL TRIAL REGISTRATION URL: http://clinicaltrials.gov. Unique identifier: NCT00611286.
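The numbers needed to treat for harm quoted above are simply the reciprocal of the absolute risk increase between the 24- and 6-month arms. The sketch below works through that arithmetic with illustrative event rates; these are not the PRODIGY subgroup rates, which the abstract does not report.

```python
import math

def number_needed_to_harm(risk_long_dapt, risk_short_dapt):
    """NNH = 1 / absolute risk increase, rounded up to a whole patient."""
    return math.ceil(1.0 / (risk_long_dapt - risk_short_dapt))

# Illustrative rates only (the abstract does not report the subgroup event rates):
# major bleeding of 9% vs 3% on 24- vs 6-month DAPT would give an NNH of 17.
print(number_needed_to_harm(0.09, 0.03))
```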

Relevance:

30.00%

Publisher:

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market shares from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as the free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business models (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business models (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims at depicting the process of IdealX's search for the appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source-related literature. In this article, a business model is not only considered as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.

Relevance:

30.00%

Publisher:

Abstract:

Information Retrieval is an important albeit imperfect component of information technologies. A problem of insufficient diversity of retrieved documents is one of the primary issues studied in this research. This study shows that this problem leads to a decrease of precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued. This is done by increasing the diversity of retrieved documents. This study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the “bedrock” of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods of diversification of retrieved documents from the literature conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed but for which it is actively used). To accomplish the aim, the retrieval precision of the search session should be optimized with a multistage stochastic programming model. However, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system (ADTIR) was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents. The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR, the main reason being the insufficient quality of the generated clusters in the TREC collection, which violated the underlying assumption.
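The baseline system mentioned above combines BM-25 term weighting with Rocchio relevance feedback. For reference, the textbook Rocchio update is sketched below on toy term-weight vectors; the alpha, beta and gamma constants are the usual defaults, not values taken from this thesis.

```python
import numpy as np

def rocchio_update(query, relevant_docs, nonrelevant_docs,
                   alpha=1.0, beta=0.75, gamma=0.15):
    """Standard Rocchio update: move the query toward relevant documents
    and away from non-relevant ones (all vectors are term-weight vectors)."""
    q_new = alpha * query
    if len(relevant_docs):
        q_new += beta * np.mean(relevant_docs, axis=0)
    if len(nonrelevant_docs):
        q_new -= gamma * np.mean(nonrelevant_docs, axis=0)
    return np.clip(q_new, 0.0, None)   # negative weights are usually dropped

# Toy 4-term vocabulary example.
query = np.array([1.0, 0.0, 0.5, 0.0])
relevant = np.array([[0.8, 0.2, 0.9, 0.0], [0.7, 0.0, 1.0, 0.1]])
nonrelevant = np.array([[0.0, 0.9, 0.1, 0.8]])
print(rocchio_update(query, relevant, nonrelevant))
```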

Relevance:

30.00%

Publisher:

Abstract:

For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high-quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step but rather require computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence in interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z − 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur.
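Using the definition φ(z) = (e^z − 1)/z given above, a single exponential Euler step for y' = f(y) can be sketched as follows. This dense-matrix version evaluates φ directly and stands in for the Krylov subspace approximation that the thesis develops for large systems; the test problem is an invented toy example.

```python
import numpy as np
from scipy.linalg import expm

def phi(M):
    """Matrix version of (e^z - 1)/z: solves M X = expm(M) - I (M assumed nonsingular)."""
    return np.linalg.solve(M, expm(M) - np.eye(M.shape[0]))

def exponential_euler_step(y, h, f, jacobian):
    """One EEM step: y_{n+1} = y_n + h * phi(h*J) @ f(y_n)."""
    J = jacobian(y)
    return y + h * phi(h * J) @ f(y)

# Toy stiff linear test problem y' = A y, for which one EEM step reproduces expm(h*A) @ y exactly.
A = np.array([[-100.0, 1.0],
              [0.0, -2.0]])
f = lambda y: A @ y
y0 = np.array([1.0, 1.0])
print(exponential_euler_step(y0, h=0.1, f=f, jacobian=lambda y: A))
```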
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.

Relevance:

30.00%

Publisher:

Abstract:

Relentless CMOS scaling coupled with lower design tolerances is making ICs increasingly susceptible to wear-out related permanent faults and transient faults, necessitating on-chip fault tolerance in future chip microprocessors (CMPs). In this paper we introduce a new energy-efficient fault-tolerant CMP architecture known as Redundant Execution using Critical Value Forwarding (RECVF). RECVF is based on two observations: (i) forwarding critical instruction results from the leading to the trailing core enables the latter to execute faster, and (ii) this speedup can be exploited to reduce energy consumption by operating the trailing core at a lower voltage-frequency level. Our evaluation shows that RECVF consumes 37% less energy than conventional dual modular redundant (DMR) execution of a program. It consumes only 1.26 times the energy of a non-fault-tolerant baseline and has a performance overhead of just 1.2%.
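The 37% figure can be cross-checked against the 1.26x baseline figure under the rough assumption, not stated in the paper, that conventional DMR costs about twice the energy of a single non-fault-tolerant core:

```python
# Back-of-the-envelope check (assumes DMR ~ 2x the baseline energy, which the
# paper does not state explicitly).
baseline_energy = 1.0
dmr_energy = 2.0 * baseline_energy       # two cores running the full program
recvf_energy = 1.26 * baseline_energy    # figure reported in the abstract

saving_vs_dmr = 1.0 - recvf_energy / dmr_energy
print(f"Energy saving vs DMR: {saving_vs_dmr:.0%}")   # ~37%
```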

Relevance:

30.00%

Publisher:

Abstract:

An empirical study is made of the fatigue crack growth rate in ferrite-martensite dual-phase (FMDP) steel. Particular attention is given to the effect of ferrite content in the range of 24.2% to 41.5%, with good fatigue resistance found at 33.8%. Variations in ferrite content did not affect the crack growth rate da/dN when plotted against the effective stress intensity factor range ΔK_eff, which was assumed to follow a linear relation with the crack tip stress intensity factor range ΔK. A high effective stress intensity factor range corresponds to uniformly distributed, small-size ferrite and martensite. No other appreciable correlation could be related to the microstructure morphology of the FMDP steel. The closure stress intensity factor, however, is affected by the ferrite content, reaching a maximum value of 0.7. In general, crack growth followed the interphase between the martensite and ferrite.

The fatigue crack growth process can be divided into Stage I and Stage II, where the former is highly sensitive to changes in ΔK and the latter increases with ΔK depending on the R ratio. The same data, when correlated with the strain energy density factor range ΔS, showed negligible dependence on mean stress or R ratio for Stage I crack growth. A parameter α involving the ratio of ultimate stress to yield stress, the percent reduction of area and R is introduced for Stage II crack growth so that the crack growth rate data for different R collapse onto a single curve with a narrow scatter band when plotted against αΔS.
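A common way to express the effective range used above is through a closure correction, ΔK_eff = K_max − K_op, combined with a Paris-type growth law. The sketch below uses that textbook form with invented constants; it is not a fit to the FMDP data, and the study's own Stage II correlation is based on αΔS rather than this law.

```python
def effective_range(k_max, k_min, k_op):
    """Closure-corrected range: only the part of the cycle above K_op drives growth."""
    return k_max - max(k_min, k_op)

def paris_growth_rate(delta_k_eff, C=1e-11, m=3.0):
    """Paris-type law da/dN = C * (dK_eff)^m (illustrative constants only)."""
    return C * delta_k_eff ** m

# Hypothetical cycle: K_max = 20, K_min = 2, closure at K_op = 6 (MPa*sqrt(m)).
dk_eff = effective_range(k_max=20.0, k_min=2.0, k_op=6.0)
print(f"dK_eff = {dk_eff:.1f} MPa*sqrt(m), da/dN = {paris_growth_rate(dk_eff):.2e} m/cycle")
```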

Relevance:

30.00%

Publisher:

Abstract:

For engineering application of the dual-grating parallel matched interrogation method, the key issues are expanding the measurable range, improving usability, and lowering the cost by adopting a compact and simple setup based on existing conditions, as well as improving the precision of the data-processing scheme. A credible and effective data-processing scheme based on a novel divisional look-up table is proposed, building on the advantages of other schemes. Each measured value belongs to a certain section, which is identified first; the value is then looked up in the table and mapped to the corresponding microstrain. The scheme not only solves the inherent problems of the traditional approach (the double-value problem and a small measurable range) but also enhances the precision, which improves the performance of the system. From the experimental results, the measurable range of the system is 525 με and the precision is ±1 με with normal matched gratings. The system works in real time, which makes it suitable for most engineering measurement requirements. (C) 2007 Elsevier GmbH. All rights reserved.
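The divisional look-up idea can be sketched in two steps: decide which section a reading belongs to, then interpolate within that section's table. The example below is a schematic reconstruction with invented calibration values, not the authors' data; it also shows why the section must be identified first, since the same reading maps to different strains on different branches.

```python
import numpy as np

# Hypothetical calibration: each section maps a monotonic span of the measured
# quantity (e.g., a reflected power ratio) to microstrain. All values are invented.
sections = [
    {"readings": np.array([0.10, 0.30, 0.55, 0.80]), "microstrain": np.array([0, 60, 120, 175])},
    {"readings": np.array([0.80, 0.55, 0.30, 0.12]), "microstrain": np.array([175, 300, 420, 525])},
]

def lookup_microstrain(reading, section_index):
    """Step 1: the section is identified first (passed in explicitly here);
    Step 2: interpolate within that section's look-up table."""
    sec = sections[section_index]
    order = np.argsort(sec["readings"])           # np.interp needs ascending x values
    return float(np.interp(reading, sec["readings"][order], sec["microstrain"][order]))

print(lookup_microstrain(0.45, section_index=0))   # rising branch
print(lookup_microstrain(0.45, section_index=1))   # falling branch: same reading, different strain
```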

Relevance:

30.00%

Publisher:

Abstract:

Protein adsorption plays a crucial role in biomaterial surface science as it is directly linked to the biocompatibility of artificial biomaterial devices. Here, the protein adsorption mechanism is elucidated using dual polarization interferometry and a quartz crystal microbalance to characterize lysozyme layer properties on a silica surface at different coverage values. Lysozyme is observed to adsorb from sparse monolayer to multilayer coverage. At low coverage an irreversibly adsorbed layer is formed with slight deformation, consistent with a side-on orientation. At higher coverage values, dynamic re-orientation effects are observed which lead to monolayer surface coverages of 2-3 ng/mm², corresponding to edge-on and/or end-on orientations. These monolayer thickness values ranged between 3 and 4.5 nm, with a protein density value of 0.60 g/mL and 50 wt% solvent mass. A further increase of coverage results in the formation of a multilayer structure. Using the hydration content and other physical layer properties, a tentative model of lysozyme adsorption is proposed.
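As a rough consistency check, which is not a calculation reported in the paper, multiplying the quoted layer thickness by the quoted layer density gives a mass per unit area of the same order as the reported monolayer coverages:

```python
# Rough consistency check: layer mass per area ~ density x thickness.
# The input values are quoted in the abstract; the check itself is ours, not the authors'.
density_g_per_ml = 0.60                 # 0.60 g/mL = 0.60e-3 g/mm^3
for thickness_nm in (3.0, 4.5):
    thickness_mm = thickness_nm * 1e-6
    coverage_ng_per_mm2 = density_g_per_ml * 1e-3 * thickness_mm * 1e9
    print(f"{thickness_nm} nm layer -> ~{coverage_ng_per_mm2:.1f} ng/mm^2")
```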

Relevance:

30.00%

Publisher:

Abstract:

A kind of ultra-narrow dual-channel filter is proposed in principle and demonstrated experimentally. The filter is designed by means of two sampled fibre Bragg gratings (SFBGs), one with periodic 0-π sampling and the other with symmetrical spatial sampling. The former creates two stopbands in the transmission spectrum and the latter produces two ultra-narrow passbands. The filter has a 3-dB bandwidth of about 1 pm, two orders of magnitude less than the bandwidth of traditional SFBG filters. The proposed filter has the merit that the channel spacing remains unchanged when the filter is tuned.
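For context, the channel spacing of a sampled FBG is usually estimated from the textbook relation Δλ ≈ λ_B² / (2·n_eff·P), where P is the sampling period. The sketch below evaluates that relation for illustrative parameter values, which are not taken from this paper.

```python
def sfbg_channel_spacing_nm(bragg_wavelength_nm, n_eff, sampling_period_um):
    """Textbook estimate: channel spacing = lambda_B^2 / (2 * n_eff * P)."""
    period_nm = sampling_period_um * 1e3
    return bragg_wavelength_nm ** 2 / (2.0 * n_eff * period_nm)

# Illustrative values only (not from the paper): 1550 nm Bragg wavelength,
# n_eff = 1.45, 1 mm sampling period -> roughly 0.83 nm channel spacing.
print(f"{sfbg_channel_spacing_nm(1550.0, 1.45, 1000.0):.3f} nm")
```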

Relevance:

30.00%

Publisher:

Abstract:

This paper reports the mega- and micro-sporogenesis and the female and male gametogenesis of Swertia cincta for the first time, with the aim of discussing the systematic position of section Platynema and section Ophelia of Swertia. Anthers are tetrasporangiate. The development of the anther walls conforms to the dicotyledonous type. The tapetum cells have a dual origin and are similar to the glandular type. There are two middle layers. The endothecium and epidermis persist. Cytokinesis in the microsporocyte meiosis is of the simultaneous type and the microspore tetrads are tetrahedral. Pollen grains are 3-celled. The ovary is bicarpellate and unilocular. The placentation is parietal, with 12 series of ovules. The ovule is unitegmic, tenuinucellar and ana-campylotropous. The embryo sac originates from a single archesporial cell. The chalazal megaspore of the linear tetrad becomes the functional megaspore. The development of the embryo sac is of the Polygonum type. Before fertilization, the two polar nuclei fuse into one secondary nucleus. The three antipodal cells persist and divide into 5-8 cells. A comparison between the two sections indicates that section Platynema is better treated as a distinct section and is more advanced than section Ophelia according to the evolutionary trends of embryological characters.

Relevance:

30.00%

Publisher:

Abstract:

The characteristics of an extreme-ultraviolet (XUV) continuum light source and its application to a dual-laser plasma (DLP) photoabsorption experiment are described. The continuum-emitting plasma was formed by focusing a 7 ps, 248 nm, 15 mJ laser pulse onto a number of selected targets known to be good XUV continuum emitters (Sm, W, Au and Pb), while the second, absorbing plasma was produced by a 15 ns, 1064 nm, 300 mJ pulse. The duration of the continuum emission for these plasmas has a mean value of approximately 150 ps, but depends on both the target material and the picosecond laser pulse energy. Using this picosecond DLP set-up we have been able to measure the photoabsorption spectrum of an actinide ion (thorium) for the first time.