864 results for Sharable Content Object Reference Model (SCORM)


Relevance: 40.00%

Abstract:

The neutron-to-proton ratio of the structure functions, F₂ⁿ/F₂ᵖ, as well as the corresponding difference F₂ᵖ − F₂ⁿ, are obtained within a statistical quark model for the nucleon, where the quark energy levels are given by a central linear confining potential.

Relevance: 40.00%

Abstract:

Fresh persimmon has a high moisture content (about 85% wet basis), making it highly perishable and requiring adequate drying conditions to obtain an acceptable dehydrated product. The drying kinetics of persimmon cv. Rama Forte were studied in a fixed-bed dryer at temperatures ranging from 50 to 80 °C and an air velocity of 0.8 m/s. Shrinkage during drying was described by a linear correlation with water content. Effective diffusivity as a function of moisture content, accounting for the shrinkage occurring during drying, was evaluated from a Fourier-series solution of Fick's diffusion equation. Effective diffusivity values at moisture contents between 0.09 and 4.23 kg water/kg dry matter were found to be in the range of 2.6 × 10⁻¹⁰ m²/s to 5.4 × 10⁻¹⁰ m²/s, and their dependence on drying-air temperature was represented by an Arrhenius-type equation. The activation energy increased with decreasing water content in the persimmons.
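As a rough illustration of the two relations invoked here, the sketch below evaluates the slab-geometry Fourier-series solution of Fick's second law together with an Arrhenius-type diffusivity. All numerical values (pre-exponential factor, activation energy, sample half-thickness) are illustrative assumptions, not the paper's fitted parameters.

```python
# Minimal sketch, assuming an infinite slab dried from both faces.
# Parameter values are made up for illustration only.
import numpy as np

R_GAS = 8.314  # universal gas constant, J/(mol K)

def diffusivity(T_kelvin, D0=1e-6, Ea=30e3):
    """Arrhenius-type dependence of effective diffusivity on temperature."""
    return D0 * np.exp(-Ea / (R_GAS * T_kelvin))

def moisture_ratio(t, D, half_thickness, n_terms=50):
    """Fourier-series solution of Fick's second law for a slab:
    MR = (8/pi^2) * sum 1/(2n+1)^2 * exp(-(2n+1)^2 pi^2 D t / (4 L^2))."""
    n = np.arange(n_terms)
    coeff = 8.0 / np.pi**2 / (2 * n + 1) ** 2
    expo = np.exp(-((2 * n + 1) ** 2) * np.pi**2 * D * t / (4 * half_thickness**2))
    return float(np.sum(coeff * expo))

D = diffusivity(273.15 + 60)                 # drying air at 60 degrees C
mr = moisture_ratio(t=3600.0, D=D, half_thickness=5e-3)
print(f"D = {D:.2e} m^2/s, moisture ratio after 1 h: {mr:.3f}")
```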

Relevance: 40.00%

Abstract:

We analyse the properties of the Sp(1, R) model states using a basis obtained from deformed harmonic-oscillator wavefunctions. We make an Sp(1, R) calculation for ¹²C and consider bases obtained from oblate, triaxial and prolate intrinsic states. The model states are given by angular momentum projection of vibrational phonons, which are associated with the giant monopole and quadrupole resonances.
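For context, angular-momentum projection from an intrinsic state |Φ⟩ is conventionally written with the operator below; this is the standard textbook form, not necessarily the paper's exact conventions.

```latex
\hat{P}^{J}_{MK} = \frac{2J+1}{8\pi^{2}} \int d\Omega \, D^{J\,*}_{MK}(\Omega)\, \hat{R}(\Omega),
\qquad
\lvert \Psi^{J}_{M} \rangle \;\propto\; \sum_{K} g_{K}\, \hat{P}^{J}_{MK}\, \lvert \Phi \rangle .
```

Here D^J_{MK} is a Wigner D-function, R̂(Ω) the rotation operator, and the g_K are mixing coefficients.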

Relevance: 40.00%

Abstract:

The strangeness content of the nucleon is determined from a statistical model using confined quark levels, and is shown to agree well with the corresponding values extracted from experimental data. The quark levels are generated by a Dirac equation with a linear confining potential (scalar plus vector). With the requirement that the Gottfried sum rule violation reported by the New Muon Collaboration (NMC) be well reproduced, we also obtain the difference between the structure functions of the proton and neutron, and the corresponding sea-quark contributions.
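For reference, the Gottfried sum referred to here is the standard parton-model relation (a fact about the sum rule itself, not a formula quoted from this abstract):

```latex
S_G \;=\; \int_0^1 \frac{dx}{x}\,\bigl[F_2^p(x) - F_2^n(x)\bigr]
\;=\; \frac{1}{3} \;+\; \frac{2}{3}\int_0^1 dx\,\bigl[\bar{u}(x) - \bar{d}(x)\bigr].
```

A flavour-symmetric light sea gives S_G = 1/3; the NMC measurement of a significantly smaller value (about 0.235) is the violation the model is required to reproduce, implying an excess of anti-down over anti-up quarks in the proton sea.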

Relevance: 40.00%

Abstract:

Simulations of overshooting tropical deep convection using a Cloud Resolving Model with bulk microphysics are presented in order to examine the effect on the water content of the TTL (Tropical Tropopause Layer) and lower stratosphere. This case study is a subproject of the HIBISCUS (Impact of tropical convection on the upper troposphere and lower stratosphere at global scale) campaign, which took place in Bauru, Brazil (22° S, 49° W), from the end of January to early March 2004. Comparisons between 2-D and 3-D simulations suggest that the use of 3-D dynamics is vital in order to capture the mixing between the overshoot and the stratospheric air, which caused evaporation of ice and resulted in an overall moistening of the lower stratosphere. In contrast, a dehydrating effect was predicted by the 2-D simulation due to the extra time, allowed by the lack of mixing, for the ice transported to the region to precipitate out of the overshoot air.

Three different strengths of convection were simulated in 3-D by applying successively lower heating rates (used to initiate the convection) in the boundary layer. Moistening was produced in all cases, indicating that convective vigour is not a factor in whether moistening or dehydration is produced by clouds that penetrate the tropopause, since the weakest case only just did so.

An estimate of the moistening effect of these clouds on an air parcel traversing a convective region is made based on the domain-mean simulated moistening and the frequency of convective events observed by the IPMet (Instituto de Pesquisas Meteorológicas, Universidade Estadual Paulista) radar (an S-band type at 2.8 GHz) to have the same 10 dBZ echo-top height as those simulated. These suggest a fairly significant mean moistening of 0.26, 0.13 and 0.05 ppmv in the strongest, medium and weakest cases, respectively, for heights between 16 and 17 km. Since the cold point and WMO (World Meteorological Organization) tropopause in this region lie at ∼15.9 km, this is likely to represent direct stratospheric moistening. Much more moistening is predicted for the 15-16 km height range, with increases of 0.85-2.8 ppmv. However, this air would need to be lofted through the tropopause via the Brewer-Dobson circulation in order for it to have a stratospheric effect. Whether this is likely is uncertain; in addition, the dehydration of air as it passes through the cold trap and the number of times that trajectories sample convective regions need to be taken into account to gauge the overall stratospheric effect. Nevertheless, the results suggest a potentially significant role for convection in determining the stratospheric water content.

Sensitivity tests exploring the impact of increased aerosol numbers in the boundary layer suggest that a corresponding rise in cloud droplet numbers at cloud base would increase the number concentrations of the ice crystals transported to the TTL, which had the effect of reducing the fall speeds of the ice and causing a ∼13% rise in the mean vapour increase in both the 15-16 and 16-17 km height ranges compared to the control case. Increases in total water were much larger, 34% and 132% higher for the same height ranges respectively, but it is unclear whether the extra ice would be able to evaporate before precipitating from the region. These results suggest a possible impact of natural and anthropogenic aerosols on how convective clouds affect stratospheric moisture levels.

Relevance: 40.00%

Abstract:

A review of the literature shows that electronic systems design largely employs a top-down methodology, which is vital for success in the synthesis and implementation of electronic systems. In this context, this paper presents a new computational tool, named BD2XML, to support electronic systems design. From a block diagram of a mixed-signal system, it generates object code in the XML markup language. XML is an interesting choice because of its great flexibility and readability. BD2XML was developed under the object-oriented paradigm. The AD7528 converter, modeled in MATLAB/Simulink, was used as a case study; MATLAB/Simulink was chosen as a target due to its wide dissemination in academia and industry. This case study demonstrates the functionality of BD2XML and prompts a reflection on the design challenges. An automatic tool for electronic systems design therefore reduces design time and cost.
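The abstract does not show BD2XML's output schema, so the sketch below only illustrates the general block-diagram-to-XML translation step using Python's standard library; every element and attribute name here is an assumption made for the example, not BD2XML's actual format.

```python
# Hypothetical sketch: serialize a tiny block-diagram description to XML.
# Element/attribute names are invented; BD2XML's real schema is not published here.
import xml.etree.ElementTree as ET

blocks = [
    {"id": "dac1", "type": "AD7528", "inputs": ["bus_in"], "outputs": ["vout_a"]},
    {"id": "scope1", "type": "Scope", "inputs": ["vout_a"], "outputs": []},
]

root = ET.Element("blockDiagram", name="mixed_signal_example")
for b in blocks:
    el = ET.SubElement(root, "block", id=b["id"], type=b["type"])
    for port in b["inputs"]:
        ET.SubElement(el, "input", signal=port)   # one element per input signal
    for port in b["outputs"]:
        ET.SubElement(el, "output", signal=port)  # one element per output signal

print(ET.tostring(root, encoding="unicode"))
```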

Relevance: 40.00%

Abstract:

Graduate Program in Computer Science - IBILCE

Relevance: 40.00%

Abstract:

The research takes as its central theme the bibliographic record conversion process. The object of study is the conversion of analogue bibliographic records to the MARC21 Bibliographic format, based on a syntactic and semantic analysis of records described according to descriptive metadata structure standards and content standards. The objective is to develop a theoretical-conceptual model of the syntax and semantics of bibliographic records, drawing on the linguistic studies of Saussure and Hjelmslev on the manifestations of human language, to underpin the development of a computational interpreter for converting bibliographic records to the MARC21 Bibliographic format, in such a way that both the semantic value of the represented informational resource and the reliability of the representation can be confirmed. Given these objectives, the methodological trajectory of the research follows a qualitative approach of an exploratory, descriptive and experimental nature, with recourse to the literature. On the theoretical plane, the research contributes to questions inherent to the syntactic and semantic aspects of bibliographic records while engaging the interdisciplinarity of Information Science, Computer Science and Linguistics. Its contribution to practice lies in the development of Scan for MARC, a computational interpreter that can be adopted by any institution wishing to convert bibliographic record databases to the MARC21 Bibliographic format from bibliographic record description and visualization schemes (AACR2r and ISBD), an aspect of the research considered innovative.
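To make the conversion step concrete, here is a deliberately tiny, hypothetical mapping from an AACR2r/ISBD-style record to MARC21-like field strings. The input field names and the sample record are invented; only the MARC21 tags and subfield codes (100, 245, 260; $a/$b/$c) follow the actual format, and the real Scan for MARC interpreter is far more involved.

```python
# Toy sketch of an ISBD/AACR2r-to-MARC21 field mapping; not Scan for MARC itself.
def to_marc21(record):
    """Map a simple record dict to MARC21-like field strings."""
    fields = []
    if "author" in record:
        fields.append(f"100 1# $a {record['author']}")          # main entry, personal name
    if "title" in record:
        fields.append(f"245 10 $a {record['title']} / $c {record.get('author', '')}")
    if "publisher" in record:
        fields.append(f"260 ## $a {record.get('place', '')} : "
                      f"$b {record['publisher']}, $c {record.get('year', '')}")
    return fields

rec = {"author": "Saussure, Ferdinand de", "title": "Course in general linguistics",
       "place": "New York", "publisher": "McGraw-Hill", "year": "1966"}
print("\n".join(to_marc21(rec)))
```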

Relevance: 40.00%

Abstract:

Maize demand for food, livestock feed, and biofuel is expected to increase substantially. The Western U.S. Corn Belt accounts for 23% of U.S. maize production, and irrigated maize accounts for 43 and 58% of maize land area and total production, respectively, in this region. Reliable estimates of the most sensitive parameters governing the performance of maize systems in the region (yield potential [YP], water-limited yield potential [YP-W], the yield gap between actual yield and YP, and resource-use efficiency) are lacking. A simulation model was used to quantify YP under irrigated and rainfed conditions based on weather data, soil properties, and crop management at 18 locations. In a separate study, five years of soil water data measured in central Nebraska were used to analyze soil water recharge during the non-growing season, because soil water content at sowing is a critical component of the water supply available to summer crops. On-farm data, including yield, irrigation, and nitrogen (N) rate for 777 field-years, were used to quantify the size of yield gaps and evaluate resource-use efficiency. Simulated average YP and YP-W were 14.4 and 8.3 Mg ha⁻¹, respectively. Geospatial variation in YP was associated with solar radiation and temperature during the post-anthesis phase, while variation in water-limited yield was linked to the longitudinal variation in seasonal rainfall and evaporative demand. Analysis of soil water recharge indicates that 80% of the variation in soil water content at sowing can be explained by precipitation during the non-growing season and residual soil water at the end of the previous growing season. A linear relationship between YP-W and water supply (slope: 19.3 kg ha⁻¹ mm⁻¹; x-intercept: 100 mm) can be used as a benchmark to diagnose and improve farmers' water productivity (WP; kg grain per unit of water supply). Evaluation of data from farmers' fields provides proof of concept and helps identify management constraints on high levels of productivity and resource-use efficiency. On average, actual yields of irrigated maize systems were 11% below YP. WP and N-fertilizer use efficiency (NUE) were high (14 kg grain mm⁻¹ water supply and 71 kg grain kg⁻¹ N fertilizer) despite the application of large amounts of irrigation water and N fertilizer. While there is limited scope for substantial increases in actual average yields, WP and NUE can be further increased by: (1) switching from surface to pivot irrigation systems, (2) using conservation instead of conventional tillage in soybean-maize rotations, (3) implementing irrigation schedules based on crop water requirements, and (4) better N-fertilizer management.
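The benchmark line quoted above translates directly into a one-line yield ceiling; the sketch below applies it, with the example field's water supply and actual yield invented for illustration.

```python
# Water-productivity benchmark from the stated slope (19.3 kg ha-1 mm-1)
# and x-intercept (100 mm). The example inputs are made up.
def benchmark_yield(water_supply_mm, slope=19.3, x_intercept=100.0):
    """Water-limited yield ceiling (kg/ha) implied by the linear benchmark."""
    return max(0.0, slope * (water_supply_mm - x_intercept))

supply, actual = 550.0, 7000.0            # mm water supply, kg/ha actual yield
ceiling = benchmark_yield(supply)          # 19.3 * (550 - 100) = 8685 kg/ha
print(f"benchmark: {ceiling:.0f} kg/ha, exploitable gap: {ceiling - actual:.0f} kg/ha")
```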

Relevance: 40.00%

Abstract:

In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community; in particular, it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax, both for queries and for query results; a rigorously specified formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose query constraints; fully customizable and highly structured query results with a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionalities for the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
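The thesis's own query language is not shown in the abstract, so the sketch below illustrates the same kind of content-aware retrieval using today's rdflib and SPARQL instead of the author's engine; the helm: vocabulary and its properties are invented for the example.

```python
# Illustrative stand-in: query RDF document metadata with rdflib/SPARQL,
# not the thesis's query language. The helm: vocabulary is made up.
from rdflib import Graph

TTL = """
@prefix helm: <http://example.org/helm#> .
<http://example.org/doc/1> helm:kind "theorem" ; helm:refersTo "Nat.plus" .
<http://example.org/doc/2> helm:kind "definition" ; helm:refersTo "Nat.plus" .
"""

g = Graph()
g.parse(data=TTL, format="turtle")

# Content-aware request: every theorem whose statement refers to Nat.plus.
q = """
PREFIX helm: <http://example.org/helm#>
SELECT ?doc WHERE { ?doc helm:kind "theorem" ; helm:refersTo "Nat.plus" . }
"""
for row in g.query(q):
    print(row.doc)
```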

Relevance: 40.00%

Abstract:

For broadcasting purposes, Mixed Reality, the combination of real and virtual scene content, has become ubiquitous nowadays. Mixed Reality recording still requires expensive studio setups, however, and is often limited to simple color keying. We present a system for Mixed Reality applications which uses depth keying and provides three-dimensional mixing of real and artificial content. It features enhanced realism through automatic shadow computation, which we consider a core requirement for realism and a convincing visual perception, alongside the correct alignment of the two modalities and correct occlusion handling. Furthermore, we present a way to support the placement of virtual content in the scene. The core feature of our system is the incorporation of a Time-of-Flight (ToF) camera device, which delivers real-time depth images of the environment at a reasonable resolution and quality. This camera is used to build a static environment model, and it also allows correct handling of mutual occlusions between real and virtual content, shadow computation, and enhanced content planning. The presented system is inexpensive, compact, mobile, and flexible, and provides convenient calibration procedures. Chroma keying is replaced by depth keying, which is performed efficiently on the Graphics Processing Unit (GPU) using the environment model and the current ToF-camera image. Automatic extraction and tracking of dynamic scene content is thereby performed, and this information is used for planning and alignment of virtual content. A further useful feature is that depth maps of the mixed content are available in real time, which makes the approach suitable for future 3DTV productions. This paper gives an overview of the whole system, including camera calibration, environment model generation, real-time keying and mixing of virtual and real content, shadowing for virtual content, and dynamic object tracking for content planning.
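A depth key reduces, per pixel, to a comparison of the two depth maps; the numpy sketch below shows that core test and how it also yields the mixed-scene depth map. The paper performs this on the GPU, and all array contents here are made-up stand-ins for the ToF measurements and the rendered virtual layer.

```python
# Toy depth keying: each pixel shows whichever layer is nearer, which also
# resolves mutual occlusion between real and virtual content.
import numpy as np

h, w = 4, 4
real_rgb = np.full((h, w, 3), 0.2); real_depth = np.full((h, w), 2.0)  # meters
virt_rgb = np.full((h, w, 3), 0.8); virt_depth = np.full((h, w), 3.0)
virt_depth[1:3, 1:3] = 1.0        # part of the virtual object sits in front

virtual_wins = virt_depth < real_depth            # per-pixel depth test
mixed = np.where(virtual_wins[..., None], virt_rgb, real_rgb)
mixed_depth = np.minimum(real_depth, virt_depth)  # depth map of the mixed scene
print(mixed[..., 0])   # virtual layer visible only where it is nearer
```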

Relevance: 40.00%

Abstract:

The execution of a project requires resources that are generally scarce. Classical approaches to resource allocation assume that the usage of these resources by an individual project activity is constant during the execution of that activity; in practice, however, the project manager may vary resource usage over time within prescribed bounds. This variation gives rise to the project scheduling problem which consists in allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated equals the prescribed work content of each activity, and various work-content-related constraints are met. We formulate this problem for the first time as a mixed-integer linear program. Our computational results for a standard test set from the literature indicate that this model outperforms the state-of-the-art solution methods for this problem.
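The paper's full mixed-integer formulation is not reproduced in the abstract; the following is a minimal time-indexed sketch of its core idea, meeting each activity's work content under per-period usage bounds and a shared resource capacity, using the PuLP library with invented data and omitting the precedence and contiguity constraints the real model must carry.

```python
# Minimal time-indexed MILP sketch of the flexible work-content idea; data
# and horizon are invented, and precedence/non-preemption are omitted.
import pulp

T = range(10)            # discrete time periods (illustrative horizon)
acts = {"A": 6, "B": 4}  # work contents (resource-units x periods), made up
R = 3                    # capacity of the single renewable resource
lo, hi = 1, 3            # per-period usage bounds while an activity runs

prob = pulp.LpProblem("work_content", pulp.LpMinimize)
u = pulp.LpVariable.dicts("u", (acts, T), lowBound=0, upBound=hi, cat="Integer")
y = pulp.LpVariable.dicts("y", (acts, T), cat="Binary")   # activity active in t?
Cmax = pulp.LpVariable("Cmax", lowBound=0)

prob += Cmax                                               # minimize duration
for a, work in acts.items():
    prob += pulp.lpSum(u[a][t] for t in T) == work         # meet work content
    for t in T:
        prob += u[a][t] <= hi * y[a][t]                    # usage only if active
        prob += u[a][t] >= lo * y[a][t]
        prob += Cmax >= (t + 1) * y[a][t]                  # makespan bound
for t in T:
    prob += pulp.lpSum(u[a][t] for a in acts) <= R         # resource capacity

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", Cmax.value())
```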

Relevance: 40.00%

Abstract:

A patient classification system was developed integrating a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and allocation of nursing personnel to optimize the utilization of resources.

The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was defined in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity, by a panel of expert nurses; (2) predictive validity, through a paired t-test analysis of pre-shift and post-shift categorization of patients; (3) initial reliability, by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability, by the Kappa statistic.

The nursing distribution system was a linear programming model using a branch-and-bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning the staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective function variables to define priorities for the allocation of staff.

The demand constraints required that the total acuity points needed for each unit be met and that each unit have a minimum number of RNs. The supply constraints were: (1) the total availability of each type of staff and the value of that staff member, determined relative to that staff type's ability to perform the job functions of an RN (e.g., value for eight hours: RN = 8 points, LVN = 6 points); and (2) the number of personnel available for floating between units.

The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison. Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization through the addition of a dollar coefficient to the objective function.
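As a toy counterpart to the model described, the sketch below minimizes penalty-weighted staffing subject to acuity-point demand and minimum-RN constraints; unit names, availabilities, and penalty weights are invented, and the float-between-units rule is simplified to a plain availability cap.

```python
# Toy penalty-weighted nursing allocation LP/MILP; all data are invented.
import pulp

units = {"ICU": 40, "Med": 24}           # required acuity points per shift
min_rn = {"ICU": 3, "Med": 2}            # minimum RNs per unit
# staff type -> (points per 8 h, penalty weight, number available)
staff = {"RN": (8, 1.0, 6), "LVN": (6, 1.2, 4)}

prob = pulp.LpProblem("nursing_allocation", pulp.LpMinimize)
n = pulp.LpVariable.dicts("n", (staff, units), lowBound=0, cat="Integer")

# Objective: penalty-weighted count of personnel assigned.
prob += pulp.lpSum(staff[s][1] * n[s][u] for s in staff for u in units)
for u, pts in units.items():
    prob += pulp.lpSum(staff[s][0] * n[s][u] for s in staff) >= pts  # acuity demand
    prob += n["RN"][u] >= min_rn[u]                                  # minimum RNs
for s, (_, _, avail) in staff.items():
    prob += pulp.lpSum(n[s][u] for u in units) <= avail              # availability

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for s in staff:
    for u in units:
        print(s, u, int(n[s][u].value()))
```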