965 results for Freezing and processing


Relevance:

90.00%

Publisher:

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.: s.n.], 2009.

Relevance:

90.00%

Publisher:

Abstract:

In industrial plants, oil and oil compounds are usually transported through closed pipelines of circular cross-section. The use of radiotracers in oil transport and processing facilities makes it possible to calibrate flowmeters, measure mean residence times in cracking columns, locate points of obstruction or leakage in underground ducts, and investigate flow behavior in industrial processes such as distillation towers. Inspection techniques based on radiotracers are non-destructive, simple, economical and highly accurate. Among them, the Total Count method, which uses a small amount of radiotracer of known activity, is acknowledged as an absolute technique for flow rate measurement. To conduct the research, a viscous fluid transport system was designed and assembled, composed of four PVC pipelines 13 m long (12 m horizontal and 1 m vertical) with ½, ¾, 1 and 2-inch gauges, respectively, interconnected by maneuvering valves. This system was used to simulate different flow conditions of petroleum compounds and for experimental studies of the flow profile in the horizontal and upward directions. Because 198Au presents a single photopeak (411.8 keV), it was chosen as the radioisotope for labeling the oil; a small amount (6 ml, with an activity of around 200 kBq) was injected into the oil transport lines. A 2" × 2" NaI scintillation detector with well-defined geometry was used to measure the total activity and determine the calibration factor F and, positioned after a homogenization distance and connected to a standardized set of nuclear instrumentation modules (NIM), to detect the radioactive cloud.
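For context, the Total Count method estimates the volumetric flow rate as Q = F·A/S, where A is the injected activity, S is the time-integrated count recorded while the radioactive cloud passes the detector, and F is the calibration factor obtained with the well-defined detection geometry. The sketch below is a minimal Python illustration of that calculation; the pulse shape, activity and calibration factor are hypothetical values, not results from this experiment.

```python
import numpy as np
from scipy import integrate

def total_count_flow_rate(activity_bq, count_rate_cps, times_s, calib_factor):
    """Total Count estimate of flow rate: Q = F * A / S.

    calib_factor F is expressed in (counts/s) per (Bq/ml), so Q comes out
    in ml/s when A is in Bq and S (the integrated count) is in counts.
    """
    total_counts = integrate.trapezoid(count_rate_cps, times_s)  # S
    return calib_factor * activity_bq / total_counts

# Hypothetical detector record: a Gaussian-shaped pulse as the cloud passes
t = np.linspace(0, 60, 601)                              # time, s
pulse = 800.0 * np.exp(-0.5 * ((t - 30.0) / 5.0) ** 2)   # count rate, cps

q = total_count_flow_rate(activity_bq=200e3, count_rate_cps=pulse,
                          times_s=t, calib_factor=0.05)
print(f"Estimated flow rate: {q:.1f} ml/s")
```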

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Despite their increasing popularity, little is known about how users perceive mobile devices such as smartphones and tablet PCs in medical contexts. Available studies are often restricted to evaluating the success of specific interventions and do not adequately cover the users' basic attitudes, for example, their expectations or concerns toward using mobile devices in medical settings. OBJECTIVE: The objective of the study was to obtain a comprehensive picture, from the perspective of both the patients and the doctors, regarding the use and acceptance of mobile devices within medical contexts in general, as well as the perceived challenges when introducing the technology. METHODS: Doctors working at Hannover Medical School (206/1151, response rate 17.90%), as well as patients admitted to this facility (213/279, 76.3%), were surveyed about their acceptance and use of mobile devices in medical settings. Regarding demographics, both samples were representative of the respective study population. GNU R (version 3.1.1) was used for statistical testing. Fisher's exact test (two-sided, alpha=.05, with Monte Carlo approximation, 2000 replicates) was applied to determine dependencies between two variables. RESULTS: The majority of participants already own mobile devices (doctors, 168/206, 81.6%; patients, 110/213, 51.6%). For doctors, use in a professional context does not depend on age (P=.66), professional experience (P=.80), or function (P=.34); gender was a factor (P=.009), and use was more common among male (61/135, 45.2%) than female doctors (17/67, 25%). A correlation between use of mobile devices and age (P=.001) as well as education (P=.002) was seen for patients. Minor differences regarding how mobile devices are perceived in sensitive medical contexts mostly relate to data security: patients are more critical of the devices being used for storing and processing patient data; every fifth patient opposed this, but nevertheless 4.8% of doctors (10/206) use their devices for this purpose. Both groups voiced only minor concerns about the credibility of the provided content or the technical reliability of the devices. While 8.3% of the doctors (17/206) avoided use during patient contact because they thought patients might be unfamiliar with the devices, 11.7% of patients (25/213) expressed concerns about the technology being too complicated to be used in a health context. CONCLUSIONS: Differences in how patients and doctors perceive the use of mobile devices can be attributed to age and level of education; these factors are often mentioned as contributors to the problems with (mobile) technologies. To fully realize the potential of mobile technologies in a health care context, the needs of both the elderly and the educationally disadvantaged need to be carefully addressed in all strategies relating to mobile technology in a health context.
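To make the testing procedure concrete, the sketch below shows one common way to approximate a test of independence by Monte Carlo permutation, using the chi-square statistic as the test statistic rather than Fisher's exact probability (an assumption of this sketch, not necessarily the exact routine used in GNU R). The example table uses the male/female usage counts reported above (61 of 135 vs. 17 of 67 doctors).

```python
import numpy as np
from scipy.stats import chi2_contingency

def monte_carlo_independence_test(table, n_replicates=2000, seed=0):
    """Monte Carlo permutation test of independence for a contingency table.

    Individual-level labels are rebuilt from the table, one margin is
    reshuffled n_replicates times, and the chi-square statistic of each
    permuted table is compared with the observed statistic.
    """
    table = np.asarray(table)
    rng = np.random.default_rng(seed)
    # Rebuild individual-level observations (row label, column label)
    rows = np.repeat(np.arange(table.shape[0]), table.sum(axis=1))
    cols = np.concatenate([np.repeat(np.arange(table.shape[1]), r) for r in table])
    observed_stat = chi2_contingency(table, correction=False)[0]
    exceed = 0
    for _ in range(n_replicates):
        perm = rng.permutation(cols)
        perm_table = np.zeros_like(table)
        np.add.at(perm_table, (rows, perm), 1)
        if chi2_contingency(perm_table, correction=False)[0] >= observed_stat:
            exceed += 1
    return (exceed + 1) / (n_replicates + 1)  # permutation p-value

# 2x2 table: professional device use (yes/no) by gender (male/female)
print(monte_carlo_independence_test([[61, 74], [17, 50]]))
```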

Relevance:

90.00%

Publisher:

Abstract:

Nucleic acids play key roles in the storage and processing of genetic information, as well as in the regulation of cellular processes. Consequently, they represent attractive targets for drugs against gene-related diseases. On the other hand, synthetic oligonucleotide analogues have found application as chemotherapeutic agents targeting cellular DNA and RNA. The development of effective nucleic acid-based chemotherapeutic strategies requires adequate analytical techniques capable of providing detailed information about the nucleotide sequences, the presence of structural modifications, the formation of higher-order structures, as well as the interaction of nucleic acids with other cellular components and chemotherapeutic agents. Due to the impressive technical and methodological developments of recent years, tandem mass spectrometry has evolved into one of the most powerful tools supporting research related to nucleic acids. This review covers the literature of the past decade devoted to the tandem mass spectrometric investigation of nucleic acids, with the main focus on the fundamental mechanistic aspects governing the gas-phase dissociation of DNA, RNA, modified oligonucleotide analogues, and their adducts with metal ions. Additionally, recent findings on the elucidation of nucleic acid higher-order structures by tandem mass spectrometry are reviewed.

Relevance:

90.00%

Publisher:

Abstract:

This thesis discusses market design and regulation in electricity systems, focusing on the information exchange between the regulated grid firm and the generation firms, as well as on the regulation of the grid firm. In the first chapter, an economic framework is developed to consistently analyze different market designs and the information exchange between the grid firm and the generation firms. Perfect competition between the generation firms and perfect regulation of the grid firm are assumed. A numerical algorithm is developed and its feasibility demonstrated on a large-scale problem. The effects of different market designs for the Central Western European (CWE) region until 2030 are analyzed. In the second chapter, the consequences of restricted grid expansion within the current market design in the CWE region until 2030 are analyzed. In the third chapter, the assumption of efficient markets is modified. The focus of the analysis is then whether and how inefficiencies in information availability and processing affect different market designs. For different parameter settings, nodal and zonal pricing are compared with respect to welfare in the spot and forward markets. In the fourth chapter, information asymmetries between the regulator and the regulated firm are analyzed. The optimal regulatory strategy is derived for a firm providing one output with two substitutable inputs, where one input and the absolute quantity of inputs are not observable by the regulator. The result is then compared to current regulatory approaches.
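As a toy illustration of the nodal-versus-zonal comparison mentioned above (and not of the thesis's large-scale CWE model), the following sketch dispatches a hypothetical two-node system with one generator per node: when the connecting line is congested, nodal prices diverge, while with ample line capacity a single, zonal-style system price emerges. All numbers are made up for illustration.

```python
def dispatch(demand_a, demand_b, cap_a, cap_b, cost_a, cost_b, line_limit):
    """Least-cost dispatch of a two-node system with one generator per node.

    Node A hosts the cheap unit, node B the expensive one; exports from A
    to B are capped by line_limit. Returns generation at each node and the
    nodal prices (marginal cost of serving one extra MW at each node).
    """
    # Cheap unit at A serves its own demand plus exports up to the line limit
    gen_a = min(demand_a + line_limit, cap_a, demand_a + demand_b)
    export = min(line_limit, max(gen_a - demand_a, 0))
    gen_b = demand_b - export
    price_a = cost_a  # A's own (cheap) unit is marginal at node A
    # B is served by A only while the line and A's capacity are not binding
    price_b = cost_a if export < line_limit and gen_a < cap_a else cost_b
    return gen_a, gen_b, price_a, price_b

# Congested case: nodal prices diverge (20 vs 60 €/MWh)
print(dispatch(demand_a=50, demand_b=100, cap_a=200, cap_b=150,
               cost_a=20.0, cost_b=60.0, line_limit=30))
# Uncongested ("copper-plate"/zonal) case: a single price of 20 €/MWh emerges
print(dispatch(demand_a=50, demand_b=100, cap_a=200, cap_b=150,
               cost_a=20.0, cost_b=60.0, line_limit=1000))
```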

Relevance:

90.00%

Publisher:

Abstract:

Chitosan is a polysaccharide derived from chitin, which is obtained mainly from crustacean shells and shrimp waste. The uses of chitosan are related to the molar weight and deacetylation degree of the biopolymer. The aim of this work was to study the chitin deacetylation reaction by following the viscosity-average molar weight and the deacetylation degree of chitosan as a function of reaction time. Deacetylation was carried out in concentrated alkaline solution (421 g L−1) at 130 °C, and the reaction proceeded for 4 h. Chitosan paste obtained after 20, 90 and 240 min was used to produce biofilms, which were characterized according to water vapor permeability and mechanical properties (tensile strength and percentage tensile elongation at break). Over the reaction time the deacetylation degree reached 93%, and a 50% reduction in the viscosity-average molar weight was found relative to the value at the first 20 min of reaction. Both processes exhibited pseudo-first-order kinetics. Biofilm produced from the chitosan paste with a high deacetylation degree showed higher water vapor permeability (WVP), tensile strength (TS) and elongation (E) when compared to films with a low deacetylation degree.
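As an illustration of how pseudo-first-order behavior of this kind can be quantified, the sketch below fits an exponential approach-to-plateau model to hypothetical deacetylation-degree data shaped like the trend described above (rising toward roughly 93% over 240 min); the data points and the fitted rate constant are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_first_order(t, y_inf, y0, k):
    """Pseudo-first-order approach of a property y(t) to its plateau y_inf."""
    return y_inf + (y0 - y_inf) * np.exp(-k * t)

# Hypothetical (time in min, deacetylation degree in %) points
t = np.array([20, 60, 90, 150, 240], dtype=float)
dd = np.array([62, 78, 84, 90, 93], dtype=float)

params, _ = curve_fit(pseudo_first_order, t, dd, p0=[93, 60, 0.01])
y_inf, y0, k = params
print(f"plateau = {y_inf:.1f} %, rate constant k = {k:.4f} 1/min")
```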

Relevance:

90.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly being used to derive value out of this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages it provides, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! delivers early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them into distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
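The idea of repeatable progressive samples can be illustrated with a small, self-contained sketch: rows are ordered by a hash of a stable key so that every run sees the same sequence of growing samples, and an aggregate is reported at each sample size. This is only a loose, single-machine illustration of the progressive-analytics idea, not the NOW! system or its progress semantics; the data and the progressive_mean helper are hypothetical.

```python
import hashlib
import numpy as np
import pandas as pd

def progressive_mean(df, column, fractions=(0.01, 0.05, 0.25, 1.0), key="id"):
    """Report an aggregate over progressively larger, repeatable samples.

    Rows are ordered by a hash of a stable key so every run sees the same
    progressive samples (repeatable semantics), and the running estimate
    is reported as the sample fraction grows.
    """
    order = df[key].astype(str).map(
        lambda v: hashlib.md5(v.encode()).hexdigest())
    ranked = df.assign(_h=order).sort_values("_h")
    results = []
    for frac in fractions:
        n = max(1, int(frac * len(ranked)))
        sample = ranked.head(n)
        results.append({"fraction": frac, "rows": n,
                        "estimate": sample[column].mean()})
    return pd.DataFrame(results)

# Synthetic example data
rng = np.random.default_rng(7)
data = pd.DataFrame({"id": np.arange(100_000),
                     "amount": rng.exponential(scale=40.0, size=100_000)})
print(progressive_mean(data, "amount"))
```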

Relevance:

90.00%

Publisher:

Abstract:

In order to optimize frontal detection in sea surface temperature fields at 4 km resolution, a combined statistical and expert-based approach is applied to test different degrees of spatial smoothing of the data prior to the detection process. Fronts are usually detected at 1 km resolution using the histogram-based, single image edge detection (SIED) algorithm developed by Cayula and Cornillon in 1992, with a standard preliminary smoothing using a median filter and a 3 × 3 pixel kernel. Here, detections are performed in three study regions (off Morocco, the Mozambique Channel, and north-western Australia) and across the Indian Ocean basin using the combination of multiple windows (CMW) method developed by Nieto, Demarcq and McClatchie in 2012, which improves on the original Cayula and Cornillon algorithm. Detections at 4 km and 1 km resolution are compared. Fronts are divided into two intensity classes (“weak” and “strong”) according to their thermal gradient. A preliminary smoothing is applied prior to the detection using different convolutions: three types of filters (median, average and Gaussian) combined with four kernel sizes (3 × 3, 5 × 5, 7 × 7, and 9 × 9 pixels) and three detection window sizes (16 × 16, 24 × 24 and 32 × 32 pixels), to test the effect of these smoothing combinations on reducing the background noise of the data and therefore on improving the frontal detection. The performance of the combinations on 4 km data is evaluated using two criteria: detection efficiency and front length. We find that the optimal combination of preliminary smoothing parameters in enhancing detection efficiency and preserving front length includes a median filter, a 16 × 16 pixel window size, and a 5 × 5 pixel kernel for strong fronts and a 7 × 7 pixel kernel for weak fronts. Results show an improvement in detection performance (from largest to smallest window size) of 71% for strong fronts and 120% for weak fronts. Despite the small window used (16 × 16 pixels), the length of the fronts has been preserved relative to that found with 1 km data. This optimal preliminary smoothing and the CMW detection algorithm on 4 km sea surface temperature data are then used to describe the spatial distribution of the monthly frequencies of occurrence for both strong and weak fronts across the Indian Ocean basin. In general, strong fronts are observed in coastal areas, whereas weak fronts, with some seasonal exceptions, are mainly located in the open ocean. This study shows that adequate noise reduction through preliminary smoothing of the data considerably improves frontal detection efficiency as well as the overall quality of the results. Consequently, the use of 4 km data enables frontal detections similar to 1 km data (using a standard median 3 × 3 convolution) in terms of detectability, length and location. This method, using 4 km data, is easily applicable to large regions or at the global scale, with far fewer constraints on data manipulation and processing time relative to 1 km data.
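To illustrate how the preliminary smoothing interacts with detection, the sketch below applies a median filter with different kernel sizes to a synthetic SST field and then flags strong gradients. The gradient-threshold detector is only a simple stand-in for the histogram-based SIED/CMW algorithms discussed above, and the field, kernel sizes and threshold are illustrative choices.

```python
import numpy as np
from scipy.ndimage import median_filter, sobel

def detect_fronts(sst, kernel=5, gradient_threshold=0.4):
    """Smooth an SST field with a median filter, then flag strong gradients.

    A gradient-threshold stand-in for histogram-based front detection,
    used only to show how the preliminary-smoothing kernel size changes
    the set of pixels flagged as frontal. The threshold is arbitrary.
    """
    smoothed = median_filter(sst, size=kernel)
    gx = sobel(smoothed, axis=1)
    gy = sobel(smoothed, axis=0)
    gradient = np.hypot(gx, gy)
    return gradient > gradient_threshold  # boolean front mask

# Synthetic field: a smooth thermal front along x, plus noise
x = np.tile(np.arange(200.0), (200, 1))
sst = 20 + 2 * np.tanh((x - 100) / 15) \
      + np.random.default_rng(1).normal(0, 0.2, (200, 200))

for k in (3, 5, 7, 9):
    print(f"kernel {k}x{k}: {detect_fronts(sst, kernel=k).sum()} front pixels")
```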

Relevance:

90.00%

Publisher:

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.: s.n.], 2009.

Relevance:

90.00%

Publisher:

Abstract:

Doctoral thesis, Agricultural Sciences (Animal Reproduction), 26 June 2013, Universidade dos Açores.

Relevance:

90.00%

Publisher:

Abstract:

Solution-grown colloidal nanocrystal (NC) materials represent ideal candidates for optoelectronic devices, due to the flexibility with which they can be synthesized, the ease with which they can be processed for device-fabrication purposes and, foremost, their excellent, size-tunable optical properties, such as high photoluminescence (PL) quantum yield, color purity, and broad absorption spectra extending up to the near infrared. The advent of surfactant-assisted synthesis of thermodynamically stable colloidal solutions of NCs has led to peerless results in terms of uniform size distribution, composition, rational shape design and the possibility of building heterostructured NCs (HNCs) comprising two or more different materials joined together. By tailoring the composition, shape and size of each component, HNCs with gradually higher levels of complexity have been conceived and realized, endowed with outstanding characteristics and optoelectronic properties. In this review, we discuss recent advances in the design of HNCs for efficient light-emitting diodes (LEDs) and photovoltaic (PV) solar cell devices. In particular, we focus on the materials required to obtain superior optoelectronic quality and efficient devices, as well as on their preparation and processing potential and limitations.

Relevance:

90.00%

Publisher:

Abstract:

Tomato (Lycopersicon esculentum Mill.) is the second most important vegetable crop worldwide and a rich source of hydrophilic (H) and lipophilic (L) antioxidants. The H fraction is constituted mainly by ascorbic acid and soluble phenolic compounds, while the L fraction contains carotenoids (mostly lycopene), tocopherols, sterols and lipophilic phenolics [1,2]. To obtain these antioxidants it is necessary to follow appropriate extraction methods and processing conditions. In this regard, this study aimed at determining the optimal extraction conditions for H and L antioxidants from a tomato surplus. A 5-level full factorial design with 4 factors (extraction time (t, 0-20 min), temperature (T, 60-180 °C), ethanol percentage (Et, 0-100%) and solid/liquid ratio (S/L, 5-45 g/L)) was implemented and response surface methodology was used for the analysis. Extractions were carried out in a Biotage Initiator Microwave apparatus. The concentration-time response methods of crocin and β-carotene bleaching were applied (using 96-well microplates), since they are suitable in vitro assays to evaluate the antioxidant activity of H and L matrices, respectively [3]. Measurements were carried out at intervals of 3, 5 and 10 min (initiation, propagation and asymptotic phases), over a time frame of 200 min. The parameters Pm (maximum protected substrate), Vm (amount of protected substrate per g of extract) and the so-called IC50 were used to quantify the response. The optimum extraction conditions were as follows: t = 2.25 min, T = 149.2 °C, Et = 99.1% and S/L = 15.0 g/L for H antioxidants; and t = 15.4 min, T = 60.0 °C, Et = 33.0% and S/L = 15.0 g/L for L antioxidants. The proposed model was validated based on the high values of the adjusted coefficient of determination (R²adj > 0.91) and on the non-significant differences between predicted and experimental values. It was also found that the antioxidant capacity of the H fraction was much higher than that of the L fraction.
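For readers unfamiliar with response surface methodology, the sketch below builds a coded 5-level design for four factors, fits a second-order (linear, quadratic and two-factor interaction) model by least squares, and locates the coded optimum on a grid. The response values are synthetic stand-ins generated for the example; the study fits the measured antioxidant-activity responses instead.

```python
import itertools
import numpy as np

# Five coded levels for each of the four factors used above:
# time t, temperature T, ethanol percentage Et and solid/liquid ratio S/L.
levels = [-2, -1, 0, 1, 2]
design = np.array(list(itertools.product(levels, repeat=4)), dtype=float)

def quadratic_terms(X):
    """Expand coded factors into the terms of a second-order RSM model:
    intercept, linear, squared and two-factor interaction terms."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i in range(X.shape[1]) for j in range(i + 1, X.shape[1])]
    return np.column_stack(cols)

# Synthetic responses standing in for measured antioxidant activity
rng = np.random.default_rng(3)
true_beta = rng.normal(size=15)               # 1 + 4 + 4 + 6 model terms
y = quadratic_terms(design) @ true_beta + rng.normal(0, 0.1, len(design))

# Fit the response surface by least squares and locate the coded optimum
beta, *_ = np.linalg.lstsq(quadratic_terms(design), y, rcond=None)
grid = np.array(list(itertools.product(np.linspace(-2, 2, 9), repeat=4)))
best = grid[np.argmax(quadratic_terms(grid) @ beta)]
print("coded optimum (t, T, Et, S/L):", best)
```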

Relevance:

90.00%

Publisher:

Abstract:

Master's dissertation, Electronic and Telecommunications Engineering, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2011.

Relevance:

90.00%

Publisher:

Abstract:

Pseudokirchneriella subcapitata is a unicellular green alga widely distributed in freshwater and soils. Because of its cosmopolitan distribution, its use is recommended by national and international protocols for ecotoxicity studies. The alteration of phosphatase activities by agricultural pollutants such as heavy metals has been extensively used as a biomarker in risk assessment and biomonitoring. In this study, we compared the extraction of acid phosphatase from P. subcapitata by different procedures, and we studied the stability, substrate specificity, kinetics and the effect of Hg2+ in the crude extract. The freezing and thawing technique combined with probe sonication was the most suitable extraction method. The enzyme was stable when frozen at -20 °C for at least six months, showed an optimum pH of 5, and had a Km value of 0.27 mM for p-nitrophenylphosphate (pNPP) as substrate. Some natural organic substrates were cleaved to a similar extent as the synthetic substrate pNPP. Short-term exposure (24 hours) to Hg2+ had little effect, but inhibition of the specific activity was observed after 7 days, with an EC50 (the concentration of Hg2+ that promotes a 50% decrease in specific activity) of 12.63 μM Hg2+.
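As a worked example of how a Km such as the 0.27 mM reported above is obtained, the sketch below fits the Michaelis-Menten equation to hypothetical pNPP rate data by non-linear least squares; the substrate concentrations and rates are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """Michaelis-Menten rate law: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# Hypothetical pNPP concentrations (mM) and initial rates (arbitrary units),
# shaped to give a Km near 0.27 mM
s = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2])
v = np.array([0.016, 0.027, 0.042, 0.060, 0.074, 0.085, 0.091])

(vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[0.1, 0.3])
print(f"Vmax = {vmax:.3f} (a.u.), Km = {km:.2f} mM")
```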