Abstract:
A high-temperature, high-pressure transcritical condensing CO2 cycle (TC-CO2) is compared with a transcritical steam (TC-steam) cycle. Performance indicators such as thermal efficiency, volumetric flow rates and entropy generation are used to analyze the power cycles, wherein irreversibilities in turbo-machinery and heat exchangers are taken into account. Although both cycles yield comparable thermal efficiencies under identical operating conditions, a TC-CO2 plant is significantly more compact than a TC-steam plant; the large specific volume of steam is responsible for the bulky system. It is also found that the performance of a TC-CO2 cycle is less sensitive to source temperature variations, an important requirement for a solar thermal system. In addition, issues such as wet expansion in the turbine and vacuum in the condenser are absent in a TC-CO2 cycle. External heat addition to the working fluid is assumed to take place through a heat transfer fluid (HTF) which receives heat from a solar receiver. A TC-CO2 system receives heat through a single HTF loop, whereas for the TC-steam cycle two HTF loops in series are proposed to avoid a high temperature differential between the steam and the HTF. (C) 2013 P. Garg. Published by Elsevier Ltd.
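As a rough illustration of the kind of first-law comparison described above, the sketch below computes the thermal efficiency of a simple transcritical condensing CO2 cycle from four state points. It is not the paper's model: the pressures, temperatures, and isentropic efficiencies are assumed values, and the CoolProp property library stands in for whatever property data the study used.

```python
# Sketch of a first-law efficiency estimate for a simple transcritical
# condensing CO2 cycle. All state-point values and component efficiencies
# below are assumptions for illustration, not the paper's data.
from CoolProp.CoolProp import PropsSI

FLUID = 'CO2'
p_low, p_high = 8e6, 25e6            # Pa: condensing / turbine-inlet pressure (assumed)
T_pump_in, T_turb_in = 305.0, 873.0  # K: condenser outlet / turbine inlet (assumed)
eta_pump, eta_turb = 0.85, 0.90      # isentropic efficiencies (assumed)

# State 1: liquid CO2 leaving the condenser
h1 = PropsSI('H', 'T', T_pump_in, 'P', p_low, FLUID)
s1 = PropsSI('S', 'T', T_pump_in, 'P', p_low, FLUID)

# State 2: compression to the supercritical high pressure
h2s = PropsSI('H', 'P', p_high, 'S', s1, FLUID)   # isentropic enthalpy
h2 = h1 + (h2s - h1) / eta_pump                   # actual, with pump losses

# State 3: heat addition from the single HTF loop
h3 = PropsSI('H', 'T', T_turb_in, 'P', p_high, FLUID)
s3 = PropsSI('S', 'T', T_turb_in, 'P', p_high, FLUID)

# State 4: expansion back to the condensing pressure
h4s = PropsSI('H', 'P', p_low, 'S', s3, FLUID)    # isentropic enthalpy
h4 = h3 - eta_turb * (h3 - h4s)                   # actual, with turbine losses

q_in = h3 - h2
w_net = (h3 - h4) - (h2 - h1)
print(f"thermal efficiency ~ {w_net / q_in:.3f}")
```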
Abstract:
Nonhomologous DNA end joining (NHEJ) is one of the major double-strand break (DSB) repair pathways in higher eukaryotes. Recently, it has been shown that alternative NHEJ (A-NHEJ) occurs in the absence of classical NHEJ and is implicated in chromosomal translocations leading to cancer. In the present study, we have developed a novel biochemical assay system utilizing DSBs flanked by varying lengths of microhomology to study microhomology-mediated alternative end joining (MMEJ). We show that MMEJ can operate in normal cells when microhomology is present, irrespective of the occurrence of robust classical NHEJ. The length of the microhomology determines the efficiency of MMEJ, with 5 nt being obligatory. Using this biochemical approach, we show that the products obtained are due to MMEJ, which is dependent on MRE11, NBS1, LIGASE III, XRCC1, FEN1 and PARP1. Thus, we define the enzymatic machinery and microhomology requirements of alternative NHEJ using a well-defined biochemical system.
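A toy sketch of the central quantity in such an assay: the terminal microhomology shared by two broken DNA ends, whose length (5 nt being obligatory in this study) determines MMEJ efficiency. The sequences and the check below are invented for illustration; the actual substrates were defined biochemical constructs.

```python
# Toy illustration: the longest terminal microhomology shared by two
# broken DNA ends. Sequences are invented; the 5 nt threshold is the
# minimum length reported as obligatory for MMEJ in this study.
def longest_terminal_microhomology(left_end: str, right_end: str) -> int:
    """Longest suffix of left_end that equals a prefix of right_end."""
    for k in range(min(len(left_end), len(right_end)), 0, -1):
        if left_end[-k:] == right_end[:k]:
            return k
    return 0

left, right = "ACGTTAGGCTA", "GGCTATTCCGA"   # hypothetical DSB ends
mh = longest_terminal_microhomology(left, right)
print(f"{mh} nt of microhomology; MMEJ-competent: {mh >= 5}")
```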
Abstract:
We hypothesized that the AAV2 vector is targeted for destruction in the cytoplasm by the host cellular kinase/ubiquitination/proteasomal machinery and that modification of the targets of this machinery on the AAV2 capsid may improve its transduction efficiency. In vitro analysis with pharmacological inhibitors of cellular serine/threonine kinases (protein kinase A, protein kinase C, casein kinase II) showed an increase (20-90%) in AAV2-mediated gene expression. The three-dimensional structure of the AAV2 capsid was then analyzed to predict the sites of ubiquitination and phosphorylation. Three phosphodegrons, which are the phosphorylation sites recognized as degradation signals by ubiquitin ligases, were identified. Mutation targets comprising eight serine (S), seven threonine (T), and nine lysine (K) residues were selected in and around the phosphodegrons on the basis of their solvent accessibility, overlap with the receptor binding regions, overlap with interaction interfaces of capsid proteins, and their evolutionary conservation across AAV serotypes. AAV2-EGFP vectors with the wild-type (WT) capsid or mutant capsids (15 single S/T → alanine [A] mutants, 9 single K → arginine [R] mutants, or 2 double K → R mutants) were then evaluated in vitro. The transduction efficiencies of 11 S/T → A and 7 K → R vectors were significantly higher (~63-90%) than those of the AAV2-WT vectors (~30-40%). Further, hepatic gene transfer of these mutant vectors in vivo resulted in higher vector copy numbers (up to 4.9-fold) and transgene expression (up to 14-fold) than observed from the AAV2-WT vector. One of the mutant vectors, S489A, generated ~8-fold fewer antibodies that could be cross-neutralized by AAV2-WT. This study thus demonstrates the feasibility of the use of these novel AAV2 capsid mutant vectors in hepatic gene therapy.
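The residue-selection logic can be pictured as a simple conjunctive filter, as in the hypothetical sketch below. Apart from S489, which is named in the abstract, every residue, position, and property flag is an invented placeholder for the structural criteria listed above (solvent accessibility, location in or near a phosphodegron, conservation across serotypes).

```python
# Hypothetical sketch of the residue shortlist as a conjunctive filter.
# Apart from S489 (named in the abstract), every residue, position and
# property flag below is an invented placeholder.
candidates = [
    # (residue, position, solvent_accessible, near_phosphodegron, conserved)
    ("S", 489, True,  True,  True),
    ("T", 251, True,  True,  False),
    ("K", 532, True,  True,  True),
    ("S", 662, False, True,  True),
]

shortlist = [(aa, pos)
             for aa, pos, accessible, near_degron, conserved in candidates
             if accessible and near_degron and conserved]
print(shortlist)   # [('S', 489), ('K', 532)]
```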
Abstract:
In April 2005, a SHOALS 1000T LIDAR system was used as an efficient alternative for safely acquiring data to describe the existing conditions of nearshore bathymetry and the intertidal zone over an approximately 40.7 km2 (11.8 nm2) portion of hazardous coastline within the Olympic Coast National Marine Sanctuary (OCNMS). Data were logged from 1,593 km (860 nm) of track lines in just over 21 hours of flight time. Several islands and offshore rocks were also surveyed, and over 24,000 geo-referenced digital still photos were captured to assist with data cleaning and QA/QC. The 1 kHz bathymetry laser obtained a maximum water depth of 22.2 meters. Floating kelp beds, breaking surf lines and turbid water were all challenges to the survey. Although sea state was favorable for this time of the year, recent heavy rainfall and a persistent low-lying layer of fog reduced acquisition productivity. The existence of a completed VDatum model covering this same geographic region permitted the LIDAR data to be vertically transformed and merged with existing shallow water multibeam data and referenced to the mean lower low water (MLLW) tidal datum. Analysis of a multibeam bathymetry-LIDAR difference surface containing over 44,000 samples indicated surface deviations from –24.3 to 8.48 meters, with a mean difference of –0.967 meters, and standard deviation of 1.762 meters. Errors in data cleaning and false detections due to interference from surf, kelp, and turbidity likely account for the larger surface separations, while the remaining general surface difference trend could partially be attributed to a more dense data set, and shoal-biased cleaning, binning and gridding associated with the multibeam data for maintaining conservative least depths important for charting dangers to navigation.
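For context, the reported difference-surface statistics amount to the following computation. The array here is synthetic noise drawn to match the published mean and standard deviation, not the survey's 44,000 co-located samples.

```python
# Synthetic stand-in for the difference surface: noise drawn to match the
# published mean and standard deviation, not the survey's actual samples.
import numpy as np

rng = np.random.default_rng(0)
diff = rng.normal(loc=-0.967, scale=1.762, size=44_000)  # meters

print(f"range: {diff.min():.2f} to {diff.max():.2f} m")
print(f"mean {diff.mean():.3f} m, std {diff.std(ddof=1):.3f} m")
```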
Abstract:
The present study aimed to produce a new product with different textural and sensory properties in order to increase human consumption of the abundant Kilka resources of the Caspian Sea. Following deheading, gutting, and brining, common Kilka were battered in two different formulations, i.e. simple batter and tempura batter, via automated pre-dusting machinery; after breading with bread-crumb flour, they were flash-fried for 30 seconds at 170°C in sunflower oil. The products were frozen continuously at -40°C and, once packed, were kept at -18°C in cold storage for four months. Chemical composition (protein, fat, moisture, and ash), fatty acid profiles (29 fatty acids), chemical spoilage indices (peroxide value, thiobarbituric acid, free fatty acids, and volatile nitrogen), and microbial properties (total bacterial count and coliform count) were compared in fresh and breaded Kilka at various times: before frying (raw breaded Kilka), after frying (phase 0), and in each month of frozen storage (phases 1, 2, 3, and 4). Organoleptic properties of breaded Kilka (odor, taste, texture, crispiness, batter cohesiveness) and general acceptability were evaluated in phases 0 through 4. The chemical composition and fatty acid profiles of common Kilka showed MUFA, PUFA, and SFA contents of 36.96, 32.85, and 29.12 g/100 g lipid, respectively. Levels of ω-3 and ω-6 were 7.6 and 1.12 g/100 g lipid, respectively. Docosahexaenoic acid (20.79%) was the most abundant fatty acid in the PUFA group. The ω-3/ω-6 and PUFA/SFA ratios were 7.6 and 1.12, respectively. The high values of these indices and the high percentage of ω-3 fatty acids show that common Kilka can be considered a valuable nutritional and fishery resource, and sensible consumption of the species may reduce the risk of cardiovascular disease. Frying affected the overall fat and moisture contents of breaded Kilka: moisture content in fried breaded Kilka decreased significantly compared to raw breaded Kilka, while fat content showed exactly the opposite trend. Overall fat content in the tempura-batter treatment was significantly lower than in the simple-batter treatment (P≤0.05). The presence of hydrocolloids, namely proteins, starch, gum, and other polysaccharides, in tempura batter may inhibit moisture evaporation and its replacement with oil during frying, in addition to boosting water-holding capacity by confining water molecules. During frying, the fatty acid composition of breaded Kilka changed with both batters: the proportions of some fatty acids such as palmitic acid (C16:0), stearic acid (C18:0), oleic acid (C18:1 ω-9 cis), and linolenic acid (C18:3 ω-3) increased considerably, whereas the ω-3/ω-6, PUFA/SFA, and (EPA+DHA)/C16:0 (polyene index) ratios decreased significantly after frying. The ω-3/ω-6, PUFA/SFA, and (EPA+DHA)/C16:0 ratios in the tempura-batter treatment were higher than in the simple-batter treatment, indicating the higher nutritional value of breaded Kilka with tempura batter. Significant increases were found in peroxide value, thiobarbituric acid, and free fatty acids in fried breaded Kilka compared to raw samples, which points to fat oxidation during the cooking process. Total microbial and coliform counts decreased after the heating process. Both breaded Kilka products were of high sanitary quality at phase 0 according to the ICMSF standard.
The organoleptic evaluation showed that the odor, cohesiveness, and general acceptability indices, among others, differed significantly between the treatments (P≤0.05). In all evaluated properties, breaded Kilka with tempura batter scored higher than breaded Kilka with simple batter in every phase. During cold storage of the various breaded Kilka treatments, total lipid content, PUFA, MUFA, ω-3, ω-3/ω-6, PUFA/SFA, and the polyene index decreased significantly. These reductions, together with the significant increase of the spoilage indices, namely peroxide value, thiobarbituric acid, and free fatty acids, during frozen storage indicate oxidation and enzymatic activity during frozen storage of breaded Kilka. Considering the sensory evaluation at the end of the fourth month, and that TVB-N contents exceeded the acceptable level in the fourth month, the shelf life of the products in frozen storage was set at three months at -18°C. Statistical tests indicate a better quality of breaded Kilka processed with tempura batter compared to simple batter in terms of organoleptic evaluation, spoilage indices, and fat quality across the various sampling phases.
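The quality ratios referred to throughout are simple quotients of profile values, as the minimal check below shows. The PUFA, SFA, and DHA figures are the reported values; the EPA and palmitic acid (C16:0) figures in the polyene-index line are placeholders, since the abstract does not state them.

```python
# Quality ratios from a fatty acid profile (g/100 g lipid). PUFA and SFA
# are the reported values; EPA and C16:0 are invented placeholders.
pufa, sfa = 32.85, 29.12
print(f"PUFA/SFA = {pufa / sfa:.2f}")        # ~1.13, matching the text

dha, epa, c16_0 = 20.79, 1.5, 18.0           # DHA reported; EPA, C16:0 assumed
print(f"polyene index = {(epa + dha) / c16_0:.2f}")   # (EPA+DHA)/C16:0
```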
Design and implementation of the embedded capacitance layers for decoupling of wireless sensor nodes
Abstract:
In this paper, an embedded capacitance material (ECM) is fabricated between the power and ground layers of wireless sensor nodes, forming an integrated capacitance to replace the large number of decoupling capacitors on the board. The ECM, whose dielectric constant is 16, has the same size as the wireless sensor node, 3 cm × 3 cm, with a thickness of only 14 μm. Although the capacitance of a single ECM layer is only around 8 nF, the ECM layers can still replace the high-frequency decoupling capacitors (100 nF in our case) on the board, for two reasons. First, the parasitic inductance of the ECM layer is much lower than that of surface-mount capacitors, so a smaller ECM capacitance can achieve the same resonant frequency as the surface-mount decoupling capacitors; simulation and measurement fit this assumption well. Second, more than one layer of ECM material is used in the design to obtain a parallel connection of several ECM capacitance layers, leading to a larger total capacitance and smaller parasitics. Characterization of the ECM is carried out with an LCR meter. To evaluate the behavior of the ECM layer, time- and frequency-domain measurements are performed on the power-bus decoupling of the wireless sensor nodes. Comparisons with measurements of a bare PCB and of the discrete decoupling-capacitor solution show the improvement achieved by the ECM layer. The measurements show that the implementation of the ECM layer not only saves the space of the surface-mount decoupling capacitors, but also provides better power-bus decoupling to the nodes.
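The first reason rests on the self-resonant frequency of a capacitor, f = 1/(2π√(LC)): cutting the parasitic inductance lets a much smaller capacitance resonate at or above the same frequency. The sketch below makes this concrete with assumed inductance values; the paper's measured parasitics are not reproduced here.

```python
# Self-resonant frequency f = 1 / (2*pi*sqrt(L*C)). The ESL figures are
# assumed, not the paper's measurements; they only illustrate why an
# 8 nF ECM layer can replace a 100 nF surface-mount part.
import math

def self_resonant_frequency(L: float, C: float) -> float:
    """Resonant frequency (Hz) of a capacitor with parasitic inductance L."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

f_smt = self_resonant_frequency(L=1e-9,   C=100e-9)  # SMT cap, ~1 nH ESL (assumed)
f_ecm = self_resonant_frequency(L=80e-12, C=8e-9)    # ECM layer, ~80 pH (assumed)
print(f"SMT 100 nF: {f_smt / 1e6:.0f} MHz")   # ~16 MHz
print(f"ECM   8 nF: {f_ecm / 1e6:.0f} MHz")   # ~199 MHz
```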
Abstract:
RNA editing is a biological phenomenon that alters nascent RNA transcripts by insertion, deletion and/or substitution of one or a few nucleotides. It is ubiquitous in all kingdoms of life and in viruses. The predominant editing event in organisms with a developed central nervous system is adenosine-to-inosine deamination; inosine is recognized as guanosine by the translational machinery and by reverse transcriptase. In primates, RNA editing occurs frequently in transcripts from repetitive regions of the genome. In humans, more than 500,000 editing instances have been identified by applying computational pipelines to available ESTs and high-throughput sequencing data, and by using chemical methods. However, the functions of only a small number of cases have been studied thoroughly. RNA editing instances have been found to play roles in the synthesis of peptide variants through non-synonymous codon substitutions, in transcript variants through alterations of splicing sites, and in gene silencing through modification of miRNA sequences. We established the Database of RNA EDiting (DARNED) to accommodate the reference genomic coordinates of substitution editing in human, mouse and fly transcripts from the published literature, with additional information on edited genomic coordinates collected from various databases, e.g. UCSC and NCBI. DARNED contains mostly adenosine-to-inosine editing and allows searches based on genomic region, gene ID, and user-provided sequence. The database is accessible at http://darned.ucc.ie. RNA editing instances in coding regions are likely to result in recoding during protein synthesis. This encouraged me to focus my research on occurrences of RNA editing specific to CDS and non-Alu exonic regions. By applying various filters to discrepancies between available ESTs and their corresponding reference genomic sequences, putative RNA editing candidates were identified, and high-throughput sequencing was used to validate them. All predicted coordinates turned out to be either SNPs or unedited.
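A toy version of the EST-versus-genome screen described above: flag positions where the reference genome has A but the transcript reads G, the signature of A-to-I editing (inosine is read as guanosine). The sequences are invented, and a real pipeline would add the SNP and quality filters the text describes.

```python
# Toy EST-vs-genome screen: flag positions where the reference has A but
# the transcript shows G (inosine read as guanosine). Sequences invented;
# real pipelines add the SNP and quality filters described in the text.
def candidate_editing_sites(reference: str, transcript: str) -> list[int]:
    assert len(reference) == len(transcript)  # assumes a gapless alignment
    return [i for i, (r, t) in enumerate(zip(reference, transcript))
            if r == 'A' and t == 'G']

ref = "CCATAGGATTA"
est = "CCGTAGGGTTA"
print(candidate_editing_sites(ref, est))   # [2, 7]
```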
Abstract:
Purpose – The purpose of this paper is to identify, clarify and tabulate the various managerial issues encountered, to aid in the management of the complex health and safety concerns which occur within a confined construction site environment.
Design/methodology/approach – This is achieved by conducting extensive qualitative and quantitative research in the form of case studies, interviews and a questionnaire survey.
Findings – The leading managerial issues in the management of health and safety on a confined construction site are found to be: “Difficulty to move materials around site safely”; “Lack of adequate room for the effective handling of materials”; “Difficulty in ensuring site is tidy and all plant and materials are stored safely”; “Close proximity of individuals to operation of large plant and machinery”; and joint fifth “Difficulty in ensuring proper arrangement and collection of waste materials on-site” along with “Difficulty in controlling hazardous materials and equipment on site”.
Practical implications – The practical implication of these results is that, with the sustained development of urban centres on a global scale and the increasing complexity of architectural designs, the majority of on-site project management professionals face the onerous task of completing often intricate designs within a limited spatial environment, under strict health and safety parameters.
Originality/value – The value of the findings is that, provided on-site management professionals successfully identify the various managerial issues highlighted, the successful management of health and safety on a confined construction site is attainable.
Abstract:
The emergence of new business models, namely the establishment of partnerships between organizations, and the chance that companies have of adding existing data on the web, especially in the semantic web, to their information, have emphasized some problems in databases, particularly related to data quality. Poor data can result in a loss of competitiveness for the organizations holding them and may even lead to their disappearance, since many of their decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems must understand these data, i.e., associated semantics are needed. The solution presented in this paper uses ontologies: (i) for the specification of data cleaning operations and (ii) as a way of solving the semantic heterogeneity of data stored in different sources. With data cleaning operations defined at a conceptual level, and with mappings between domain ontologies and an ontology derived from a database, the operations may be instantiated and proposed to the expert/specialist for execution over that database, thus enabling their interoperability.
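A minimal sketch of the idea, with invented names throughout: a cleaning operation is declared once against a domain-ontology concept and then instantiated over a concrete schema through an ontology-to-schema mapping, so the same conceptual rule can be proposed for execution over different databases.

```python
# Invented names throughout: a cleaning rule bound to an ontology concept
# is instantiated over a concrete schema via an ontology-to-schema mapping.
ontology_to_schema = {
    "Person.phoneNumber": ("customers", "phone"),
    "Person.email":       ("customers", "mail_addr"),
}

def normalize_phone(value: str) -> str:
    """Conceptual-level operation: keep digits only."""
    return "".join(ch for ch in value if ch.isdigit())

rules = [("Person.phoneNumber", normalize_phone)]  # declared once, per concept

for concept, operation in rules:
    table, column = ontology_to_schema[concept]
    # Proposed to the expert/specialist before execution over the database:
    print(f"apply {operation.__name__} to {table}.{column}")
    print("  e.g.", normalize_phone("+351 21-123 4567"))  # -> 351211234567
```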
Abstract:
Considering Alan Turing's challenge in «Computing Machinery and Intelligence» (1950) – can machines play the «imitation game»? – it is proposed that the requirements of the Turing test are already implicitly being used to check the credibility of virtual characters and avatars. Like characters, avatars aim to visually express emotions (the exterior signs of the existence of feeling), and their creators have to resort to emotion codes. The traditional arts have contributed profusely to this field and, together with the science of anatomy, laid the grounds for the current Facial Action Coding System (FACS) and its databases. However, FACS researchers have to improve their «instruction tables» so that machines can, in the near future, be programmed to recognize human expressions (face and body) and classify them adequately. For the moment, reproductions have to rely on copying real-life expressions, and the present smile of avatars comes from mirroring their human users.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
This work project focuses on developing new approaches to enhance Portuguese exports to a defined German industry sector within the information technology and electronics fields. First and foremost, information was collected and a set of interviews with experts and top managers was conducted in order to assess the demand of the German market while identifying compatible Portuguese supply capabilities. Among the main findings, Industry 4.0 presents itself as a valuable opportunity in the German market for medium-sized Portuguese companies with embedded-systems expertise serving machinery and equipment companies. To achieve the purpose of the work project, an embedded-systems platform targeting machinery and equipment companies was proposed, and several recommendations on how to implement it were developed. An alternative approach for this platform within the German market was also considered, namely the eHealth sector, with the purpose of enhancing current healthcare service provision.
Abstract:
Six left-handed artist-educators were interviewed in an attempt to discover any patterns in their perceptions and experiences. Artists have their own culture and priorities. According to the literature, left-handed people appear more likely to suffer from dyslexia, allergies, asthma and other auto-immune diseases, as well as machinery and equipment injuries. The patterns that emerged suggested that left-handed people do indeed suffer more from dyslexia. More startling was the distinct possibility that many artists have traumatic childhood histories, commonly including negative school experiences and, for a significant number, sexual assault, perceived or actual abandonment by parents, and/or consistently low self-esteem. The researcher identified possible reasons why creative people frequently have problems at school, why they tend to be rebellious and anti-establishment oriented, how many of them perceive societal rules, and why they are more likely to be left-handed. These characteristics all have significant implications for art school administrators.
Abstract:
Attempts to reduce the energy consumed in UK homes have met with limited success. One reason for this is a lack of understanding of how people interact with domestic technology – heating systems, lights, electrical equipment and so forth. Attaining such an understanding is hampered by a chronic shortage of detailed energy use data matched to descriptions of the house, the occupants, the internal conditions and the installed services and appliances. Without such information it is impossible to produce transparent and valid models for understanding and predicting energy use. The Carbon Reduction in Buildings (CaRB) consortium of five UK universities plans to develop socio-technical models of energy use, underpinned by a flow of data from a longitudinal monitoring campaign involving several hundred UK homes. This paper outlines the models proposed, the preliminary monitoring work and the structure of the proposed longitudinal study.