903 results for Information storage and retrieval systems
Abstract:
The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and of the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. Small structures require much larger model scales than large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, so the low-frequency end of the turbulence spectrum cannot be simulated. Such flows are called flows with partial turbulence simulation. In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well addressed in many existing building codes and standards.
Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of the existing information on pressure coefficients for components and cladding in codes and standards such as ASCE 7-10.
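The quasi-steady treatment of the unsimulated low-frequency turbulence described above can be sketched in a few lines. This is only an illustrative sketch of the general idea, not the dissertation's actual joint-probability derivation; the function name, the simple gust-factor scaling, and the peak-factor value are assumptions for illustration.

```python
def full_scale_peak_cp(cp_peak_wt, i_u_low, g=3.5):
    """Rough full-scale equivalent peak pressure coefficient.

    Treats the missing low-frequency turbulence quasi-steadily: a peak
    gust velocity (1 + g * i_u_low) * U changes the dynamic pressure by
    the square of that factor, so the wind-tunnel peak Cp is scaled up
    accordingly.

    cp_peak_wt -- peak Cp measured with partial turbulence simulation
    i_u_low    -- turbulence intensity of the unsimulated low-frequency band
    g          -- peak factor assumed for the low-frequency process
    """
    return cp_peak_wt * (1.0 + g * i_u_low) ** 2

# Example: a measured peak Cp of -2.0 with 10% missing low-frequency
# turbulence intensity and an assumed peak factor of 3.5.
cp_full = full_scale_peak_cp(-2.0, 0.10)
```

The quadratic velocity-to-pressure scaling is the core of any quasi-steady argument; the full method in the dissertation combines the two frequency bands through their joint probability rather than a single gust factor.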
Abstract:
Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. To meet this demand, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. The TMCA algorithm is then proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis.
In this framework, an affinity propagation-based summarization method is also proposed to transform unorganized data into a better structure with clean, well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
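The sampling-based ensemble step mentioned above follows a common recipe for imbalanced data: undersample the majority class several times to build balanced training sets and combine the resulting models. The sketch below shows only the resampling part, under the assumption that this is the general technique meant; the function name and sizes are illustrative, not taken from the dissertation.

```python
import random

def balanced_subsets(majority, minority, n_models, seed=0):
    """Build n_models balanced training sets by undersampling the
    majority class to the minority-class size, each with a different
    random draw. One classifier would then be trained per subset and
    their predictions combined (e.g. by majority vote)."""
    rng = random.Random(seed)
    subsets = []
    for _ in range(n_models):
        sample = rng.sample(majority, len(minority))  # undersample majority
        subsets.append(sample + minority)             # keep all rare examples
    return subsets

# Example: 100 negative (majority) vs 5 positive (rare-event) examples.
neg = list(range(100))
pos = list(range(100, 105))
sets_ = balanced_subsets(neg, pos, n_models=3)
# Each of the 3 subsets contains 5 majority + 5 minority examples.
```

Every rare-event example appears in every subset, so no minority information is discarded, while each model sees a balanced class ratio.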
Abstract:
Metadata that is associated with either an information system or an information object for purposes of description, administration, legal requirements, technical functionality, use and usage, and preservation plays a critical role in ensuring the creation, management, preservation, and use and re-use of trustworthy materials, including records. Recordkeeping metadata, of which one key type is archival description, plays a particularly important role in documenting the reliability and authenticity of records and recordkeeping systems, as well as the various contexts (legal-administrative, provenancial, procedural, documentary, and technical) within which records are created and kept as they move across space and time. In the digital environment, metadata is also the means by which it is possible to identify how record components – those constituent aspects of a digital record that may be managed, stored and used separately by the creator or the preserver – can be reassembled to generate an authentic copy of a record, or reformulated per a user’s request as a customized output package. Issues relating to the creation, capture, management and preservation of adequate metadata are, therefore, integral to any research study addressing the reliability and authenticity of digital entities, regardless of the community, sector or institution within which they are being created. The InterPARES 2 Description Cross-Domain Group (DCD) examined the conceptualization, definitions, roles, and current functionality of metadata and archival description in terms of requirements generated by InterPARES 1.
Because of the need to communicate the work of InterPARES in a meaningful way not only across other disciplines but also across different archival traditions; to interface with, evaluate and inform existing standards, practices and other research projects; and to ensure interoperability across the three focus areas of InterPARES 2, the Description Cross-Domain Group also addressed its research goals with reference to wider thinking about, and developments in, recordkeeping and metadata. InterPARES 2 addressed not only records, however, but a range of digital information objects (referred to as “entities” by InterPARES 2, but not to be confused with the term “entities” as used in metadata and database applications) that are the products and by-products of government, scientific and artistic activities carried out using dynamic, interactive or experiential digital systems. The nature of these entities was determined through a diplomatic analysis undertaken as part of extensive case studies of digital systems conducted by the InterPARES 2 Focus Groups. This diplomatic analysis established whether the entities identified during the case studies were records, non-records that nevertheless raised important concerns relating to reliability and authenticity, or “potential records.” To be determined to be records, the entities had to meet the criteria outlined by archival theory – they had to have a fixed documentary format and stable content. It was not sufficient that they be considered or treated as records by the creator. “Potential records” is a new construct indicating that a digital system has the potential to create records upon demand, but does not actually fix and set aside records in the normal course of business.
The work of the Description Cross-Domain Group, therefore, addresses the metadata needs for all three categories of entities. Finally, since “metadata” as a term is used today so ubiquitously, and in so many different ways by different communities, that it is in peril of losing any specificity, part of the work of the DCD sought to name and type categories of metadata. It also addressed incentives for creators to generate appropriate metadata, as well as issues associated with the retention, maintenance and eventual disposition of the metadata that aggregates around digital entities over time.
Abstract:
Carbon capture and storage (CCS) represents an interesting climate mitigation option; however, as with any other human activity, there is a pressing need to assess and manage the associated risks. This study specifically addresses the marine environmental risk posed by CO2 leakages associated with the CCS subsea engineering system, meaning offshore pipelines and injection and plugged-and-abandoned wells. The aim of this thesis is to begin developing a complete and standardized practical procedure for performing a quantified environmental risk assessment (ERA) for CCS, with reference to the specific activities mentioned above. Such an effort would be of great relevance not only to companies willing to implement CCS, as methodological guidance, but also, by standardizing the ERA procedure, for beginning to change people’s perception of CCS, which is often discredited due to the evident lack of comprehensive and systematic methods to assess the impacts on the marine environment. The backbone of the framework developed consists of the integration of the main steps of ERA with those of quantified risk assessment (QRA), with the aim of quantitatively characterizing risk and describing it as a combination of the magnitude of the consequences and their frequency. The framework developed in this work remains, however, at a high level, as not every single aspect has been dealt with in the required detail. Thus, several alternative options are presented for consideration depending on the situation. Further specific studies should address their accuracy and efficiency and resolve the knowledge gaps that emerged, in order to establish and validate a final and complete procedure. Regardless of the knowledge gaps and uncertainties, which surely need to be addressed, this preliminary framework already finds some relevance in field applications, as non-stringent guidance for performing a CCS ERA, and it constitutes the foundation of the final framework.
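The QRA step of expressing risk as a combination of consequence magnitude and frequency can be illustrated with a minimal risk-classification sketch. The thresholds, severity scale, and function name below are placeholders for illustration only; the thesis's actual framework defines its own categories.

```python
def risk_level(frequency_per_year, consequence_severity):
    """Combine an event frequency (events/year) and a dimensionless
    consequence-severity score into a qualitative risk class.
    Risk score = frequency x magnitude; the class thresholds below
    are illustrative placeholders, not values from the thesis."""
    score = frequency_per_year * consequence_severity
    if score >= 1e-2:
        return "high"
    if score >= 1e-4:
        return "medium"
    return "low"

# Example: a hypothetical pipeline leak at 1e-3 events/year with a
# severity score of 10 lands in the "high" class under these thresholds.
level = risk_level(1e-3, 10)
```

In a full ERA/QRA the two axes would come from leak-frequency data for pipelines and wells and from modeled marine-impact magnitudes, but the combination step has this shape.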
Abstract:
The power transformer is a piece of electrical equipment that needs continuous monitoring and fast protection, since it is very expensive and essential for a power system to perform effectively. The most common protection technique used is percentage differential logic, which provides discrimination between an internal fault and different operating conditions. Unfortunately, some operating conditions of power transformers can affect the protection behavior and the power system stability. This paper proposes a new algorithm to improve differential protection performance by using fuzzy logic and Clarke's transform. An electrical power system was modeled using the Alternative Transients Program (ATP) software to obtain the operating conditions and fault situations needed to test the algorithm developed. The results were compared with a commercial relay for validation, showing the advantages of the new method.
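The two building blocks named in the abstract, Clarke's transform and the percentage differential criterion, are standard and can be sketched directly. This is a generic textbook sketch, not the paper's fuzzy-logic algorithm; the slope and pickup settings are illustrative assumptions.

```python
import math

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform: maps three phase
    quantities to alpha, beta and zero-sequence components."""
    alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    beta = (2.0 / 3.0) * (math.sqrt(3) / 2.0) * (ib - ic)
    zero = (ia + ib + ic) / 3.0
    return alpha, beta, zero

def differential_trip(i_primary, i_secondary, slope=0.25, pickup=0.1):
    """Percentage differential criterion: trip when the operating
    (differential) current exceeds a slope fraction of the restraint
    current plus a minimum pickup. Currents are assumed already
    referred to the same side of the transformer; slope/pickup values
    are illustrative, not relay settings from the paper."""
    i_op = abs(i_primary + i_secondary)
    i_res = (abs(i_primary) + abs(i_secondary)) / 2.0
    return i_op > slope * i_res + pickup

# Through-current (current enters one side, leaves the other): no trip.
# Internal fault (current flows into the zone from both sides): trip.
```

The paper's contribution sits on top of this logic, using fuzzy rules over Clarke-domain quantities to avoid misoperation in the problematic operating conditions.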
Abstract:
The photodegradation of the herbicide clomazone in the presence of S₂O₈²⁻ or of humic substances of different origin was investigated. A value of (9.4 ± 0.4) × 10⁸ M⁻¹ s⁻¹ was measured for the bimolecular rate constant of the reaction of sulfate radicals with clomazone in flash-photolysis experiments. Steady-state photolysis of peroxydisulfate, leading to the formation of sulfate radicals, in the presence of clomazone was shown to be an efficient photodegradation method for the herbicide. This is a relevant result for in situ chemical oxidation procedures involving peroxydisulfate as the oxidant. The main reaction products are 2-chlorobenzyl alcohol and 2-chlorobenzaldehyde. The degradation kinetics of clomazone was also studied under steady-state conditions induced by photolysis of Aldrich humic acid or a vermicompost extract (VCE). The results indicate that singlet oxygen is the main species responsible for clomazone degradation. The quantum yield of O₂(a¹Δg) generation (λ = 400 nm) for the VCE in D₂O, ΦΔ = (1.3 ± 0.1) × 10⁻³, was determined by measuring the O₂(a¹Δg) phosphorescence at 1270 nm. The value of the overall quenching constant of O₂(a¹Δg) by clomazone was found to be (5.7 ± 0.3) × 10⁷ M⁻¹ s⁻¹ in D₂O. The bimolecular rate constant for the reaction of clomazone with singlet oxygen was k_r = (5.4 ± 0.1) × 10⁷ M⁻¹ s⁻¹, which means that the quenching process is mainly reactive.
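The closing claim that quenching is "mainly reactive" follows directly from the two reported constants: the reactive channel accounts for almost the entire overall quenching rate. A quick arithmetic check:

```python
# Rate constants reported in the abstract (M^-1 s^-1).
k_total = 5.7e7     # overall quenching of singlet oxygen by clomazone
k_reactive = 5.4e7  # reactive (chemical) channel, k_r

# Fraction of quenching events that are reactive rather than physical.
reactive_fraction = k_reactive / k_total  # about 0.95
```

With roughly 95% of quenching events leading to reaction, physical deactivation of singlet oxygen by clomazone is a minor pathway, which is exactly what the abstract concludes.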
Abstract:
The climatic water balance is one of the most widely used tools to assess, indirectly, whether the amount of water present in the soil is capable of meeting the water needs of the plant. This study analyzed the climatic water balance, the effective soil water storage, and coffee plant transpiration under dry-regimen (rain-fed) cultivation. The daily climatic water balance was calculated for coffee from January 2003 to May 2006. It was concluded that even in the rainiest months of the year there is a hydric deficit for coffee plants grown in a dry regimen; that effective soil water storage varied greatly over the years evaluated, with September being the most critical month, when this value remained below 30%; and that relative transpiration cannot be taken as the sole evaluation method for yield losses of coffee grown in a dry regimen.
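A daily climatic water balance of the kind computed here can be sketched as a simple soil "bucket" model: rain refills storage, evapotranspiration demand depletes it, and any unmet demand accumulates as the hydric deficit. This is a minimal generic sketch, not the specific method or parameters used in the study.

```python
def daily_water_balance(rain, et0, capacity=100.0, storage=100.0):
    """Minimal daily bucket water balance (all values in mm).
    rain, et0 -- daily precipitation and evapotranspiration demand
    capacity  -- soil water-holding capacity
    storage   -- initial soil water storage
    Returns the daily storage trace and the accumulated deficit."""
    deficit = 0.0
    trace = []
    for p, et in zip(rain, et0):
        storage += p - et
        if storage > capacity:
            storage = capacity       # excess drains away as surplus
        if storage < 0.0:
            deficit += -storage      # demand not met by soil water
            storage = 0.0
        trace.append(storage)
    return trace, deficit

# Two rainless days with 8 mm/day demand against a 10 mm bucket:
# storage falls 10 -> 2 -> 0 mm, leaving a 6 mm deficit.
trace, deficit = daily_water_balance([0.0, 0.0], [8.0, 8.0],
                                     capacity=10.0, storage=10.0)
```

Running such a balance day by day over 2003-2006 is what allows the study to flag September, when storage stays below 30% of capacity, as the critical month.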
Abstract:
Nitrogen fertilization in common bean crops under no-tillage and conventional systems. Nitrogen fertilizer is necessary for high yields in common bean crops, and information on N responses under no-tillage and conventional systems is still a basic need. Thus, the objective of this research was to evaluate the effect of N application timing on common bean yield in no-tillage and conventional systems. The experimental design was a randomized block in a factorial scheme (2 × 8 + 1) with four replications. The treatments consisted of combinations of two N rates (40 and 80 kg ha⁻¹) applied as side dressing at eight distinct stages during vegetative development of the common bean (V4-3, V4-4, V4-5, V4-6, V4-7, V4-8, V4-9 and V4-10), in addition to a control plot without side-dressed N. The experiment was conducted over two years (2002 and 2003) under no-tillage on millet crop residues and under a conventional plow system. It was concluded that N fertilizer applied at the V4 stage of common bean promotes similar seed yields in no-tillage and conventional systems. Yield differences between no-tillage and conventional systems are inconsistent within the same agricultural area.
Abstract:
Hydrodynamic studies were conducted in a semi-cylindrical spouted bed column of 150 mm diameter, 1000 mm height, 60° conical base included angle, and 25 mm inlet orifice diameter. Pressure transducers at several axial positions were used to obtain pressure-fluctuation time series with 1.2 and 2.4 mm glass beads at U/U_ms from 0.3 to 1.6 and static bed depths from 150 to 600 mm. The conditions covered several flow regimes (fixed bed, incipient spouting, stable spouting, pulsating spouting, slugging, bubble spouting and fluidization). Images of the system dynamics were also acquired through the transparent walls with a digital camera. The data were analyzed via statistical, mutual information theory, spectral, and Hurst's rescaled range methods to assess the potential of these methods to characterize spouting quality. The results indicate that these methods have potential for monitoring spouted bed operation.
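Of the time-series methods listed, Hurst's rescaled range analysis is the easiest to sketch: for each window of the pressure signal one computes the range of the cumulative mean-adjusted series divided by its standard deviation, then fits log(R/S) against log(window size) to get the Hurst exponent. The sketch below shows the single-window R/S statistic; it is a generic implementation of the textbook definition, not the authors' code.

```python
import math

def rescaled_range(series):
    """R/S statistic of one window: range of the cumulative
    mean-adjusted series divided by its (population) standard
    deviation. Repeating this over many window sizes and fitting
    log(R/S) vs log(n) yields the Hurst exponent."""
    n = len(series)
    mean = sum(series) / n
    cum, running = [], 0.0
    for x in series:
        running += x - mean      # cumulative deviation from the mean
        cum.append(running)
    r = max(cum) - min(cum)      # range of the cumulative series
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return r / s
```

Applied to the pressure-fluctuation series, a Hurst exponent near 0.5 indicates uncorrelated fluctuations, while persistent (H > 0.5) or anti-persistent (H < 0.5) behavior distinguishes regimes such as stable spouting from slugging.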
Abstract:
The aim of this study was to compare the effects of low-intensity laser therapy (LILT) and light-emitting diode therapy (LEDT) on the treatment of the lesioned Achilles tendon of rats. The experimental model consisted of a partial mechanical lesion on the deep portion of the right Achilles tendon of 90 rats. One hour after the lesion, the injured animals received laser/LED applications (laser at 685 and 830 nm; LED at 630 and 880 nm), and the same procedure was repeated at 24-h intervals for 10 days. The healing process and deposition of collagen were evaluated by polarization microscopy analysis of the alignment and organization of the collagen bundles, through birefringence (optical retardation, OR). The results showed real efficiency of LEDT-based treatments and confirmed that LILT seems to be effective in the healing process. Despite the absence of coherence in LED light, tendon healing treatment with this modality was satisfactory and can certainly replace treatments based on laser light applications. Applications of the 830 nm infrared laser and the 880 nm LED were more efficient when the aim is good organization, aggregation, and alignment of the collagen bundles during tendon healing. However, more research is needed to determine a safe and more efficient protocol for LED treatment.
Abstract:
Xylem sap from woody species in the wet/dry tropics of northern Australia was analyzed for N compounds. At the peak of the dry season, arginine was the main N compound in the sap of most species of woodlands and deciduous monsoon forest. In the wet season, a marked change occurred, with amides becoming the main sap N constituents of most species. Species from an evergreen monsoon forest, with a permanent water source, transported amides in the dry season. In the dry season, nitrate accounted for 7 and 12% of total xylem sap N in species of deciduous and evergreen monsoon forests, respectively. In the wet season, the proportion of N present as nitrate increased to 22% in deciduous monsoon forest species. These results suggest that N is taken up and assimilated mainly in the wet season and that this newly assimilated N is mostly transported as amide-N (woodland and monsoon forest species) and nitrate (monsoon forest species). Arginine is the form in which stored N is remobilized and transported by woodland and deciduous monsoon forest species in the dry season. Several proteins, which may represent bark storage proteins, were detected in inner bark tissue from a range of trees in the dry season, indicating that, although N uptake appears to be limited in the dry season, the many tree and shrub species that produce flowers, fruit or leaves in the dry season use stored N to support growth. The nitrogen characteristics of the studied species are discussed in relation to the tropical environment.
Abstract:
We use theoretical and numerical methods to investigate the general pore-fluid flow patterns near geological lenses in hydrodynamic and hydrothermal systems, respectively. Analytical solutions have been rigorously derived for the pore-fluid velocity, stream function and excess pore-fluid pressure near a circular lens in a hydrodynamic system. These analytical solutions provide not only a better understanding of the physics behind the problem, but also a valuable benchmark for validating any numerical method. Since a geological lens is surrounded by a medium of large extent in nature, and the finite element method is efficient at modelling only media of finite size, determining the size of the computational domain of a finite element model, which is often overlooked by numerical analysts, is very important to ensure both the efficiency of the method and the accuracy of the numerical solution obtained. To highlight this issue, we use the derived analytical solutions to deduce a rigorous mathematical formula for designing the computational domain size of a finite element model. The proposed formula indicates that, no matter how fine the mesh or how high the order of the elements, the desired accuracy of a finite element solution for pore-fluid flow near a geological lens cannot be achieved unless the size of the finite element model is determined appropriately. Once the finite element computational model has been appropriately designed and validated in a hydrodynamic system, it is used to examine general pore-fluid flow patterns near geological lenses in hydrothermal systems. Some interesting conclusions on the behaviour of geological lenses in hydrodynamic and hydrothermal systems have been reached through the analytical and numerical analyses carried out in this paper.
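A flavor of the analytical benchmark solutions mentioned above can be given with the classical closed-form result for a circular inclusion in a uniform far-field Darcy flow (analogous to a dielectric cylinder in a uniform field): the flow inside the lens is uniform, scaled by the permeability contrast. This is a standard textbook result offered for illustration; it is not necessarily the exact expressions derived in the paper.

```python
def velocity_inside_lens(q0, k_medium, k_lens):
    """Uniform Darcy velocity inside a circular lens embedded in a
    uniform far-field flow q0, from the classical circular-inclusion
    solution: q_in = 2 * k_lens / (k_lens + k_medium) * q0.
    A permeable lens (k_lens >> k_medium) focuses flow toward itself,
    approaching twice the far-field velocity; an impermeable lens
    (k_lens << k_medium) deflects flow around itself."""
    return 2.0 * k_lens / (k_lens + k_medium) * q0

# Equal permeabilities recover the undisturbed far-field flow.
q_same = velocity_inside_lens(1.0, 1e-12, 1e-12)
```

A closed-form solution of this kind is exactly what makes the benchmark valuable: a finite element model with too small a domain truncates the far-field disturbance and cannot reproduce these values, however fine the mesh.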