Abstract:
Zygmunt Ziembiński was one of the most prominent theoreticians of law in Poland in the second half of the 20th century. He developed an original theory of law, defined as a theory of legal phenomena, which covered both the logical-linguistic and the real aspects of law. This theory served as the basis for the development of a unique, so-called advanced normative conception of sources of law, one of the greatest achievements of legal theory in Poland. The conception encompasses all the indispensable elements of a coherent system of binding legal norms: 1) indication of the political justification (ideological assumptions) of the entire system of law; 2) pre-determination of the law-making competence of government agencies; 3) determination of the status of custom and precedent; 4) compilation of a catalogue of permissible interpretation rules; 5) compilation of a catalogue of permissible inferential rules (permissible rules of legal inference); 6) compilation of a catalogue of permissible collision rules.
Abstract:
Humans and robots have complementary strengths in performing assembly operations. Humans are very good at perception tasks in unstructured environments: they can recognize and locate a part in a box of miscellaneous parts, and they are also very good at complex manipulation in tight spaces. Humans' sensory characteristics, motor abilities, knowledge, and skills give them the ability to react to unexpected situations and resolve problems quickly. In contrast, robots are very good at pick-and-place operations and are highly repeatable in placement tasks. Robots can perform tasks at high speed while maintaining precision, can operate for long periods of time, and are very good at applying high forces and torques. Typically, robots are used in mass production, while small-batch and custom production operations predominantly use manual labor. High labor costs are making it difficult for small and medium manufacturers, which are mainly involved in small-batch and custom production, to remain cost-competitive in high-wage markets. They need a way to reduce the labor cost of assembly operations. Purely robotic cells will not provide them the necessary flexibility. Creating hybrid cells where humans and robots collaborate in close physical proximity is a potential solution. The underlying idea behind such cells is to decompose assembly operations into tasks such that humans and robots collaborate by performing the sub-tasks best suited to each. Realizing hybrid cells that enable effective human-robot collaboration is challenging. This dissertation addresses the following three computational issues involved in developing and utilizing hybrid assembly cells:
- We should be able to automatically generate plans to operate hybrid assembly cells to ensure efficient cell operation (a toy allocation sketch follows this list). This requires generating feasible assembly sequences and instructions for robots and human operators, respectively. Automated planning poses two challenges. First, generating operation plans for complex assemblies is hard; the complexity can arise from the combinatorial explosion caused by the size of the assembly or from the complex paths needed to perform the assembly. Second, generating feasible plans requires accounting for robot and human motion constraints. The first objective of the dissertation is to develop the underlying computational foundations for automatically generating plans for the operation of hybrid cells, addressing both assembly complexity and motion constraints.
- Collaboration between humans and robots in the assembly cell will only be practical if human safety can be ensured during the assembly tasks that require it. The second objective of the dissertation is to evaluate different options for real-time monitoring of the state of the human operator with respect to the robot, and to develop strategies for taking appropriate measures when a planned robot move may compromise the operator's safety. To be competitive in the market, the developed solution must also consider cost without significantly compromising quality.
- In the envisioned hybrid cell, we will rely on human operators to bring parts into the cell. If the human operator makes an error in selecting a part or fails to place it correctly, the robot will be unable to perform the task assigned to it. If the error goes undetected, it can lead to a defective product and inefficiencies in cell operation. Human error can stem from confusion caused by poor-quality instructions or from the operator not paying adequate attention to them. To ensure smooth and error-free operation of the cell, we will need to monitor the state of assembly operations in the cell. The third objective of the dissertation is to identify and track parts in the cell and automatically generate instructions for corrective actions if a human operator deviates from the selected plan. Corrective actions may involve re-planning, if assembly can continue from the current state, or issuing warnings and generating instructions to undo the current task.
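The planning problem named in the first objective can be pictured as capability-aware task allocation. The sketch below is purely illustrative, not the dissertation's planner: it greedily assigns each sub-task (the task names and costs are hypothetical) to the human or the robot while respecting precedence constraints.

```python
# Illustrative sketch only: greedy human/robot allocation of assembly
# sub-tasks under precedence constraints. Names and costs are invented.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    human_cost: float          # estimated execution time for the human (s)
    robot_cost: float          # estimated execution time for the robot (s)
    preconditions: set = field(default_factory=set)

def plan(tasks):
    """Assign each ready task to the cheaper agent, in precedence order."""
    done, schedule, pending = set(), [], list(tasks)
    while pending:
        ready = [t for t in pending if t.preconditions <= done]
        if not ready:
            raise ValueError("cyclic precedence constraints")
        t = min(ready, key=lambda t: min(t.human_cost, t.robot_cost))
        agent = "human" if t.human_cost <= t.robot_cost else "robot"
        schedule.append((t.name, agent))
        done.add(t.name)
        pending.remove(t)
    return schedule

tasks = [
    Task("fetch_part_from_bin", human_cost=4.0, robot_cost=20.0),
    Task("place_part_in_fixture", human_cost=6.0, robot_cost=3.0,
         preconditions={"fetch_part_from_bin"}),
    Task("drive_screws", human_cost=15.0, robot_cost=5.0,
         preconditions={"place_part_in_fixture"}),
]
print(plan(tasks))
```

A real planner would additionally check motion feasibility for each candidate assignment, which is where the assembly-complexity and motion-constraint issues described above enter.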
Abstract:
Spasticity is a common disorder in people with upper motor neuron injury, and involvement may occur at different levels. The Modified Ashworth Scale (MAS) is the most widely used method to measure the level of involvement, but it is a subjective evaluation. Mechanomyography (MMG) is an objective technique that quantifies muscle vibration during contraction and stretching events, so it may assess the level of spasticity accurately. This study aimed to investigate the correlation between spasticity levels determined by MAS and the MMG signal in spastic and non-spastic muscles. In the experimental protocol, we evaluated 34 limbs of 22 volunteers of both genders, with a mean age of 39.91 ± 13.77 years. We evaluated the levels of spasticity by MAS in flexor and extensor muscle groups of the knee and/or elbow, where one muscle group was the agonist and the other the antagonist. MMG signals were captured simultaneously with the MAS assessment. We used custom MMG equipment, configured on the LabVIEW platform, to register and record the signals. Using MATLAB, the MMG signals were processed in the time domain (median energy) and the spectral domain (median frequency) for the three motion axes: X (transversal), Y (longitudinal) and Z (perpendicular). For bandwidth delimitation, we used a 3rd-order Butterworth filter acting in the 5-50 Hz range. Statistical tests such as Spearman's correlation coefficient, the Kruskal-Wallis test and linear correlation tests were applied. In the time domain, the Kruskal-Wallis test showed differences in median energy (MMGME) between MAS groups. The linear correlation test showed a high linear correlation between MAS and MMGME for the agonist as well as the antagonist muscle group. The largest linear correlation occurred between MAS and MMGME on the Z axis of the agonist muscle group (R2 = 0.9557), and the lowest on the X axis of the antagonist muscle group (R2 = 0.8862). The Spearman correlation test also confirmed high correlations for all axes in the time-domain analysis. In the spectral domain, the analysis showed an increase in median frequency (MMGMF) at higher MAS levels. The highest correlation coefficient between MAS and the MMGMF signal occurred on the Z axis for the agonist muscle group (R2 = 0.4883), and the lowest on the Y axis for the antagonist group (R2 = 0.1657). By the Spearman correlation test, the highest correlation occurred on the Y axis of the agonist group (0.6951; p < 0.001) and the lowest on the X axis of the antagonist group (0.3592; p < 0.001). We conclude that there was a significantly high correlation between MMGME and MAS in both muscle groups. A significant correlation also occurred between MMGMF and MAS, although moderate for the agonist group and low for the antagonist group. Thus, MMGME proved to be the more appropriate descriptor to correlate with the degree of spasticity defined by the MAS.
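As a rough illustration of the processing chain described above, the sketch below band-pass filters a single-axis MMG signal with a 3rd-order Butterworth filter (5-50 Hz) and computes median-energy and median-frequency descriptors. The sampling rate, window settings and exact energy definition are assumptions, since the abstract does not specify them.

```python
# Hedged sketch of the described MMG processing (assumed sampling rate and
# energy definition; the study's MATLAB implementation may differ).
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 1000.0  # assumed sampling rate in Hz

def preprocess(mmg, fs=FS):
    # 3rd-order Butterworth band-pass, 5-50 Hz, zero-phase filtering
    b, a = butter(3, [5.0, 50.0], btype="bandpass", fs=fs)
    return filtfilt(b, a, mmg)

def median_energy(mmg):
    """Median of the instantaneous energy (squared amplitude)."""
    return np.median(mmg ** 2)

def median_frequency(mmg, fs=FS):
    """Frequency that splits the power spectrum into two halves of equal power."""
    f, pxx = welch(mmg, fs=fs, nperseg=256)
    cum = np.cumsum(pxx)
    return f[np.searchsorted(cum, cum[-1] / 2.0)]

axis_z = preprocess(np.random.randn(5000))  # stand-in for a recorded Z-axis signal
print(median_energy(axis_z), median_frequency(axis_z))
```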
Abstract:
Landscape painting emerged as a pictorial current at the end of the 19th century, the product of a convergence of academic and scientific interests that culminated in an interest in nature. It is framed within a political thought that placed our country amid expectations of a new socio-cultural structure emphasizing liberty and human rights, the right to private property and, above all, opening its horizons to social and cultural integration; hence the need to communicate and to draw inspiration from one's own land. This research seeks to inquire into the different processes artists experience in contact with nature, processes internalized through their varied experiences with the techniques of the artistic process, through which they capture light, space, colour, the experience the viewer takes from the works, and the expectations held of the artist. This process may or may not be artistic: sometimes guided by academicism, sometimes done on commission, with expectations that often serve political ends (placing the subject matter above academic value). These are works that demand great technique, since the aim is to capture nature, which is in itself perfect, through a meticulous, almost perfectionist handling of the medium, without coming to understand that individuals, like fingerprints, differ inwardly, and will therefore capture the essence of nature according to the many experiences they have had with it. The idea of painting landscape is no longer related solely to observation but to the prospect of representing the surroundings according to each artist's very particular and special manner.
Epidemiology and genetic architecture of blood pressure: a family based study of Generation Scotland
Abstract:
Hypertension is a major risk factor for cardiovascular disease and mortality, and a growing global public health concern, with up to one-third of the world's population affected. Despite the vast amount of evidence for the benefits of blood pressure (BP) lowering accumulated to date, elevated BP is still the leading risk factor for disease and disability worldwide. It is well established that hypertension and BP are common complex traits, where multiple genetic and environmental factors contribute to BP variation. Furthermore, family and twin studies confirm the genetic component of BP, with heritability estimates in the range of 30-50%. Contemporary genomic tools that enable the genotyping of millions of genetic variants across the human genome in an efficient, reliable, and cost-effective manner have transformed hypertension genetics research. This is accompanied by international consortia that have offered unprecedentedly large sample sizes for genome-wide association studies (GWASs). While GWASs for hypertension and BP have identified more than 60 loci, variants in these loci are associated with modest effects on BP and in aggregate explain less than 3% of the variance in BP. The aim of this thesis is to study the genetic and environmental factors that influence BP and hypertension traits in the Scottish population by performing several genetic epidemiological analyses. The first part of this thesis aims to study the burden of hypertension in the Scottish population, along with assessing the familial aggregation and heritability of BP and hypertension traits. The second part aims to validate the association of common SNPs reported in large GWASs and to estimate the variance explained by these variants. In this thesis, comprehensive genetic epidemiology analyses were performed on Generation Scotland: Scottish Family Health Study (GS:SFHS), one of the largest population-based family studies. The availability of clinical and biological samples, self-reported information, and medical records for study participants allowed several assessments of the factors that influence BP variation in the Scottish population. Of the 20,753 subjects genotyped in the study, a total of 18,470 individuals (grouped into 7,025 extended families) passed the stringent quality control (QC) criteria and were available for all subsequent analyses. Based on the sources of BP-lowering treatment exposure, subjects were further classified into two groups: first, subjects with both a self-reported medications (SRMs) history and electronic prescription records (EPRs; n = 12,347); second, all subjects with at least one medication history source (n = 18,470). In the first group, the analysis showed good concordance between SRMs and EPRs (kappa = 71%), indicating that SRMs can be used as a surrogate to assess exposure to BP-lowering medication in GS:SFHS participants. Although both sources suffer from some limitations, SRMs can be considered the best available source for estimating drug exposure history in those without EPRs. The prevalence of hypertension was 40.8%, with a higher prevalence in men (46.3%) than in women (35.8%). The prevalence of awareness, treatment and controlled hypertension, as defined by the study, were 25.3%, 31.2%, and 54.3%, respectively.
These findings are lower than those reported in similar studies of other populations, with the exception of the prevalence of controlled hypertension, which can be considered better than in other populations. Odds of hypertension were higher in men, obese or overweight individuals, people with a parental history of hypertension, and those living in the most deprived areas of Scotland. On the other hand, deprivation was associated with higher odds of treatment, awareness and controlled hypertension, suggesting that people living in the most deprived areas may have been receiving better quality of care, or had higher comorbidity levels requiring greater engagement with doctors. These findings highlight the need for further work to improve hypertension management in Scotland. The family design of GS:SFHS allowed family-based analyses to assess the familial aggregation and heritability of BP and hypertension traits. The familial correlations of BP traits ranged from 0.07 to 0.20 for parent-offspring pairs and from 0.18 to 0.34 for sibling pairs. A higher correlation of BP traits was observed among first-degree relatives than among other types of relative pairs. A variance-component model adjusted for sex, body mass index (BMI), age, and age-squared was used to estimate the heritability of BP traits, which ranged from 24% to 32%, with pulse pressure (PP) having the lowest estimate. The genetic correlations between BP traits were high among systolic (SBP), diastolic (DBP) and mean arterial pressure (MAP) (G: 81% to 94%), but lower with PP (G: 22% to 78%). The sibling recurrence risk ratios (λS) for hypertension and treatment were 1.60 and 2.04, respectively. These findings confirm the genetic component of BP traits in GS:SFHS and justify further work to investigate the genetic determinants of BP. Genetic variants reported in recent large GWASs of BP traits were selected for genotyping in GS:SFHS using a custom-designed TaqMan® OpenArray®. The genotyping plate included 44 single nucleotide polymorphisms (SNPs) previously reported to be associated with BP or hypertension at genome-wide significance. A linear mixed model adjusted for age, age-squared, sex, and BMI was used to test for association between the genetic variants and BP traits. Of the 43 variants that passed QC, 11 showed a statistically significant association with at least one BP trait. The phenotypic variance explained by these variants was 1.4%, 1.5%, 1.6%, and 0.8% for SBP, DBP, MAP, and PP, respectively. A genetic risk score (GRS) constructed from the selected variants showed a positive association with BP levels and hypertension prevalence, with an average effect of a one mmHg increase per 0.80-unit increase in the GRS across the different BP traits. The impact of BP-lowering medication on genetic association studies of BP traits is well established, the typical practice being to add a fixed value (i.e. 15/10 mmHg) to the measured BP values to adjust for treatment. Using the subset of participants with both treatment exposure sources (i.e. SRMs and EPRs), the influence of using either source to justify the addition of fixed values on SNP association signals was analysed. BP phenotypes derived from EPRs were considered the true phenotypes, and those derived from SRMs were considered less accurate, with some phenotypic noise.
Comparing SNP association signals for the four BP traits between the two models derived from the different adjustments showed that MAP was the least affected by the phenotypic noise: the same significant SNPs overlapped between the two models for MAP, while the other BP traits showed some discrepancy between the two sources.
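To make the GRS analysis concrete, here is a hedged sketch with simulated data: a weighted risk score is built from per-allele dosages and published effect sizes, and ordinary least squares (standing in for the linear mixed model used in the thesis) estimates the BP change per GRS unit. None of the numbers are GS:SFHS data.

```python
# Illustrative GRS association sketch on simulated data (not GS:SFHS);
# simple OLS stands in for the thesis's linear mixed model.
import numpy as np

rng = np.random.default_rng(0)
n, m = 1000, 43                                        # individuals, SNPs passing QC
dosage = rng.integers(0, 3, size=(n, m)).astype(float) # 0/1/2 risk alleles per SNP
weights = rng.normal(0.3, 0.1, size=m)                 # published per-allele effects (mmHg)

grs = dosage @ weights                                 # weighted genetic risk score
age = rng.uniform(18, 90, n)
sex = rng.integers(0, 2, n).astype(float)
bmi = rng.normal(27, 4, n)
sbp = 120 + 1.0 * (grs / grs.std()) + 0.3 * age + rng.normal(0, 15, n)

# Regress SBP on GRS with age, age-squared, sex and BMI as covariates.
X = np.column_stack([np.ones(n), grs, age, age**2, sex, bmi])
beta, *_ = np.linalg.lstsq(X, sbp, rcond=None)
print("estimated mmHg per GRS unit:", beta[1])
```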
Abstract:
The next generation of vehicles will be equipped with automated Accident Warning Systems (AWSs) capable of warning neighbouring vehicles about hazards that might lead to accidents. The key enabling technology for these systems is Vehicular Ad-hoc Networks (VANETs), but the dynamics of such networks make the crucial timely delivery of warning messages challenging. While most previously attempted implementations have used broadcast-based data dissemination schemes, these do not cope well as data traffic load or network density increases. This thesis addresses the problem of sending warning messages in a timely manner by employing a network coding technique. The proposed NETwork COded DissEmination (NETCODE) is a VANET-based AWS responsible for generating and sending warnings to the vehicles on the road. NETCODE offers an XOR-based data dissemination scheme that sends multiple warnings in a single transmission, thereby reducing the total number of transmissions required to send the same number of warnings that broadcast schemes send. Hence, it reduces contention and collisions in the network, improving the delivery time of the warnings. The first part of this research (Chapters 3 and 4) asserts that in order to build a warning system, it is necessary to ascertain the system requirements, the information to be exchanged, and the protocols best suited for communication between vehicles. Therefore, a study of these factors is carried out, along with a review of existing proposals identifying their strengths and weaknesses. An analysis of existing broadcast-based warning schemes is then conducted, which concludes that although this is the most straightforward approach, increased load can result in an effective collapse with unacceptably long transmission delays. The second part of this research (Chapter 5) proposes the NETCODE design, including the main contribution of this thesis: a pair of encoding and decoding algorithms that make use of an XOR-based technique to reduce transmission overheads and thus allow warnings to be delivered in time. The final part of this research (Chapters 6--8) evaluates the performance of the proposed scheme in terms of how it reduces the number of transmissions in the network in response to growing data traffic load and network density, and investigates its capacity to detect potential accidents. The evaluations use a custom-built simulator to model real-world scenarios such as city areas, junctions, roundabouts and motorways. The study shows that the reduction in the number of transmissions significantly reduces contention in the network, allowing vehicles to deliver warning messages more rapidly to their neighbours. It also examines the relative performance of NETCODE when handling both sudden event-driven and longer-term periodic messages in diverse scenarios under stress caused by increasing numbers of vehicles and transmissions per vehicle. This work confirms the thesis's primary contention that XOR-based network coding provides a potential foundation on which a more efficient AWS data dissemination scheme can be built.
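The core coding idea can be shown in a few lines. This toy sketch (message formats and relay scheduling are simplified away; it is not the protocol specified in Chapter 5) XORs two equal-length warnings into a single coded packet; any vehicle that already holds one of the warnings can recover the other, halving the number of transmissions for that pair.

```python
# Toy sketch of the XOR coding idea underlying NETCODE; payloads invented.
def xor_encode(warning_a: bytes, warning_b: bytes) -> bytes:
    """Combine two equal-length warnings into one coded transmission."""
    assert len(warning_a) == len(warning_b)
    return bytes(x ^ y for x, y in zip(warning_a, warning_b))

def xor_decode(coded: bytes, known: bytes) -> bytes:
    """A vehicle that already holds one warning recovers the other."""
    return xor_encode(coded, known)

w1 = b"HAZARD:ICE@J12 "
w2 = b"HAZARD:CRASH@M4"
coded = xor_encode(w1, w2)          # one transmission instead of two
assert xor_decode(coded, w1) == w2
assert xor_decode(coded, w2) == w1
```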
Abstract:
This master's thesis examined the development of customer-driven welding production in an SME engineering workshop. The purpose of the work was to gather background information on the special characteristics of customer-driven welding production and to examine the development of welding production. The work reviewed welding processes and their advanced variants, as well as welding consumables. An important part was the examination of weld and welding-production quality and quality management, together with the mechanization and automation of welding, without neglecting occupational safety and welding ergonomics. The company's current situation was then surveyed, and development measures, the necessary calculations, and the implementation of the development work were drawn up. Finally, the improved operation was compared with the old operation, and conclusions were drawn about the development of the operation.
Abstract:
In the context of this work, we evaluated a multisensory, noninvasive prototype platform for shake-flask cultivations by monitoring three basic parameters (pH, pO2 and biomass). The focus lies on the evaluation of the biomass sensor based on backward light scattering. The application spectrum was expanded to four new organisms in addition to E. coli K12 and S. cerevisiae [1]. It could be shown that the sensor is appropriate for a wide range of standard microorganisms, e.g., L. zeae, K. pastoris, A. niger and CHO-K1. The biomass sensor signal was successfully correlated and calibrated against well-known measurement methods such as OD600, cell dry weight (CDW) and cell concentration. Logarithmic and Bleasdale-Nelder derived functions were adequate for data fitting. Measurements at low cell concentrations proved critical with respect to the signal-to-noise ratio, but the integration of a custom-made light shade in the shake flask improved these measurements significantly. This sensor-based measurement method has high potential to initiate a new generation of online bioprocess monitoring. Metabolic studies in particular will benefit from the multisensory data acquisition. The sensor is already used in lab-scale experiments for shake-flask cultivations.
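For illustration, the following sketch fits the two model families mentioned above, a logarithmic curve and the Bleasdale-Nelder function y = (a + bx)^(-1/c), to synthetic signal-versus-OD600 calibration pairs. The parameter values and data are invented, not the platform's actual calibration.

```python
# Hedged calibration sketch: fit logarithmic and Bleasdale-Nelder models
# to synthetic scattered-light signal vs. OD600 pairs.
import numpy as np
from scipy.optimize import curve_fit

def logarithmic(od, a, b):
    return a * np.log(od) + b

def bleasdale_nelder(od, a, b, c):
    base = np.clip(a + b * od, 1e-12, None)  # guard against a negative base
    return base ** (-1.0 / c)

rng = np.random.default_rng(1)
od = np.linspace(0.1, 30.0, 40)                            # offline OD600 samples
signal = bleasdale_nelder(od, 0.02, -6e-4, 1.0) + rng.normal(0, 5, od.size)

p_bn, _ = curve_fit(bleasdale_nelder, od, signal, p0=(0.02, -6e-4, 1.0))
p_log, _ = curve_fit(logarithmic, od, signal)
print("Bleasdale-Nelder:", p_bn, "logarithmic:", p_log)
```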
Abstract:
Group-living animals may eavesdrop on signalling interactions between conspecifics and integrate this information with their own past social experience in order to optimize their use of relevant information from others. However, little is known about the interplay between public (eavesdropped) and private social information. To investigate it, we first manipulated the dominance status of bystander zebrafish. Next, we either allowed or prevented bystanders from observing a fight. Finally, we assessed their behaviour towards the winners and losers of the interaction, using a custom-made video-tracking system and directional analysis. We found that only dominant bystanders that had seen the fight showed a significant increase in directional focus (a measure of attention) towards the losers of the fights. Furthermore, our results indicate that information about the fighters' acquired status was collected from the signalling interaction itself and not from post-interaction status cues, which implies the existence of individual recognition in zebrafish. Thus, we show for the first time that zebrafish, a highly social model organism, eavesdrop on conspecific agonistic interactions and that this process is modulated by the eavesdropper's dominance status. We suggest that this type of integration of public and private information may be ubiquitous in social learning processes.
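One plausible way to operationalize "directional focus" from tracking data is sketched below: the fraction of frames in which the bystander's heading points to within a tolerance angle of the observed fish. This definition, and all numbers, are assumptions for illustration; the study's actual metric comes from its own directional analysis.

```python
# Assumed definition of a directional-focus score from video-tracking output.
import numpy as np

def directional_focus(bystander_xy, heading_rad, target_xy, tol_deg=30.0):
    to_target = target_xy - bystander_xy                  # per-frame vectors
    bearing = np.arctan2(to_target[:, 1], to_target[:, 0])
    err = np.angle(np.exp(1j * (bearing - heading_rad)))  # wrap to [-pi, pi]
    return np.mean(np.abs(np.degrees(err)) <= tol_deg)

frames = 1000
bystander = np.random.rand(frames, 2) * 20       # tracked positions (cm)
target = np.tile([10.0, 25.0], (frames, 1))      # observed fish's compartment
heading = np.random.uniform(-np.pi, np.pi, frames)
print("directional focus:", directional_focus(bystander, heading, target))
```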
Dinoflagellate Genomic Organization and Phylogenetic Marker Discovery Utilizing Deep Sequencing Data
Abstract:
Dinoflagellates possess large genomes in which most genes are present in many copies, which has made studies of their genomic organization and phylogenetics challenging. Recent advances in sequencing technology have made deep sequencing of dinoflagellate transcriptomes feasible. This dissertation investigates the genomic organization of dinoflagellates to better understand the challenges of assembling dinoflagellate transcriptomic and genomic data from short-read sequencing methods, and develops new techniques that utilize deep sequencing data to identify orthologous genes across a diverse set of taxa. To better understand the genomic organization of dinoflagellates, a genomic cosmid clone of the tandemly repeated gene alcohol dehydrogenase (AHD) was sequenced and analyzed. The organization of this clone ran counter to prevailing hypotheses of genomic organization in dinoflagellates. Further, a new non-canonical splicing motif was described that could greatly improve the automated modeling and annotation of genomic data. A custom phylogenetic-marker discovery pipeline was written, incorporating methods that leverage the statistical power of large data sets. A case study on Stramenopiles was undertaken to test its utility in resolving relationships between known groups as well as the phylogenetic affinity of seven unknown taxa. The pipeline generated a set of 373 genes useful as phylogenetic markers that successfully resolved relationships among the major groups of Stramenopiles and placed all unknown taxa on the tree with strong bootstrap support. The pipeline was then used to discover 668 genes useful as phylogenetic markers in dinoflagellates. Phylogenetic analysis of 58 dinoflagellates using this set of markers produced a phylogeny with good support on all branches. The Suessiales were found to be sister to the Peridiniales. The Prorocentrales formed a monophyletic group with the Dinophysiales that was sister to the Gonyaulacales. The Gymnodiniales was found to be paraphyletic, forming three monophyletic groups. While the pipeline was used here to find phylogenetic markers, it will likely also be useful for finding orthologs of interest for other purposes, for the discovery of horizontally transferred genes, and for the separation of sequences in metagenomic data sets.
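A minimal sketch of the marker-selection idea follows: from ortholog clusters, keep genes that are single-copy in every taxon where they appear and present in a minimum fraction of taxa. The cluster format, taxon names and threshold are illustrative assumptions, not the pipeline's actual interface.

```python
# Illustrative marker-selection filter over ortholog clusters (hypothetical
# data format and threshold; not the dissertation's pipeline code).
from collections import Counter

def select_markers(clusters, taxa, min_taxon_fraction=0.8):
    """clusters: {gene_name: [taxon label, one per clustered sequence]}."""
    markers = []
    for gene, seq_taxa in clusters.items():
        counts = Counter(seq_taxa)
        single_copy = all(c == 1 for c in counts.values())
        coverage = len(counts) / len(taxa)
        if single_copy and coverage >= min_taxon_fraction:
            markers.append(gene)
    return markers

taxa = ["Symbiodinium", "Karenia", "Alexandrium", "Prorocentrum"]
clusters = {
    "hsp90": ["Symbiodinium", "Karenia", "Alexandrium", "Prorocentrum"],
    "actin": ["Symbiodinium", "Symbiodinium", "Karenia"],  # multi-copy: drop
    "eef2":  ["Symbiodinium", "Karenia", "Alexandrium"],   # 0.75 < 0.8: drop
}
print(select_markers(clusters, taxa))  # ['hsp90']
```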
Abstract:
At present there is strong pressure toward the improvement of all production processes. These improvements can target distinct factors along the production chain; in particular, owing to recent tight energy-efficiency policies, those that involve energy efficiency. As can be expected, agricultural processes are not immune to this tendency, even more so when dealing with indoor production. In this context, this work presents an innovative system that aims to improve the energy efficiency of a tree-growing platform. The improvement in energy consumption is accomplished by replacing an electric heating system with one based on thermodynamic panels. The heating-fluid flow rate and temperature were assessed experimentally by means of a custom-made scaled prototype whose actuators are commanded by a fuzzy-based controller. The results obtained suggest that the change in the heating paradigm will lead to overall savings that can easily reach 60% of the energy bill.
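As a sketch of how such a fuzzy controller might map a measured fluid temperature to an actuator command, the snippet below implements a three-rule Mamdani-style controller with weighted-average defuzzification. The membership breakpoints and rule outputs are invented for illustration; the prototype's actual rule base is not given in the abstract.

```python
# Illustrative fuzzy controller sketch (invented memberships and rules).
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heater_duty(fluid_temp_c):
    """Map heating-fluid temperature to a pump/heater duty cycle in [0, 1]."""
    cold = tri(fluid_temp_c, 10, 20, 30)
    warm = tri(fluid_temp_c, 25, 35, 45)
    hot  = tri(fluid_temp_c, 40, 50, 60)
    # Rules: cold -> high duty (0.9), warm -> medium (0.5), hot -> low (0.1)
    num = cold * 0.9 + warm * 0.5 + hot * 0.1
    den = cold + warm + hot
    return num / den if den else 0.0

for t in (18, 32, 48):
    print(t, "degC ->", round(heater_duty(t), 2))
```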
Abstract:
Part 20: Health and Care Networks
Abstract:
The book's guiding thread is teaching practice in its relation to the levels (macro, meso and micro) at which it unfolds, to the professional training intended to respond to the distinct demands of school education, to teaching practices, and to the modes of organizing teachers' work. The book comprises twelve texts by different authors with different and diverse institutional affiliations (the university and the school) and addresses the problems of school governance, of teachers, of their initial and continuing training for the development of professional competencies, and of pedagogical and curricular management practices.
Abstract:
Biofilms are the primary cause of clinical bacterial infections and are impervious to typical doses of antibiotics, necessitating very high doses for treatment. It is therefore highly desirable to develop new, alternative treatment methods that can complement or replace existing approaches using significantly lower doses of antibiotics. Current standards for studying biofilms are based on end-point studies that are invasive and destroy the biofilm during characterization. This dissertation presents the development of a novel real-time sensing and treatment technology to aid in the non-invasive characterization, monitoring and treatment of bacterial biofilms. The technology is demonstrated through a high-throughput bifurcation-based microfluidic reactor that enables the simulation of flow conditions similar to those in indwelling medical devices. The integrated microsystem developed in this work incorporates the advantages of previous in vitro platforms while attempting to overcome some of their limitations. Biofilm formation is extremely sensitive to various growth parameters, which causes large variability in biofilms between repeated experiments. In this work we investigate the use of microfluidic bifurcations to reduce biofilm growth variance. The microfluidic flow cell designed here spatially sections a single biofilm into multiple channels using flow bifurcation. Biofilms grown in the bifurcated device were evaluated and verified for reduced growth variance using standard techniques such as confocal microscopy. This uniformity in biofilm growth allows reliable comparison and evaluation of new treatments with integrated controls on a single device. Biofilm partitioning was demonstrated using the bifurcation device by exposing three of the four channels to various treatments. We studied a novel bacterial biofilm treatment, independent of traditional antibiotics, using only small-molecule inhibitors of bacterial quorum sensing (analogs) in combination with low electric fields. Studies using the bifurcation-based microfluidic flow cell integrated with real-time transduction methods, together with macro-scale end-point testing of the combination treatment, showed a significant decrease in biomass compared to untreated controls and to well-known treatments such as antibiotics. To understand the possible mechanism of action of electric-field-based treatments, fundamental efficacy studies focusing on the energy of the applied electrical signal were performed. It was shown that the total energy, and not the type, of the applied electrical signal affects the effectiveness of the treatment. A linear dependence of treatment efficacy on the applied electrical energy was also demonstrated. The integrated bifurcation-based microfluidic platform is the first microsystem that enables biofilm growth with reduced variance, as well as continuous real-time threshold-activated feedback monitoring and treatment using low electric fields. The sensors detect biofilm growth by monitoring the change in impedance across interdigitated electrodes. Using the measured impedance change and user inputs provided through a simple graphical interface, a custom-built MATLAB control module intelligently switches the system into and out of treatment mode.
Using this self-governing microsystem, in situ biofilm treatment based on the principles of the bioelectric effect was demonstrated by exposing two of the channels of the integrated bifurcation device to low doses of antibiotics.
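The threshold-activated feedback described above can be sketched as a hysteretic switch. The dissertation's control module is MATLAB-based, so this Python rendering, and its thresholds, are illustrative assumptions only.

```python
# Illustrative hysteretic treatment switch driven by impedance readings
# (thresholds invented; the actual MATLAB module's logic may differ).
def control_step(impedance_change_pct, in_treatment, on_pct=15.0, off_pct=5.0):
    """Enter treatment when the impedance change across the interdigitated
    electrodes exceeds on_pct; leave once it falls back below off_pct."""
    if not in_treatment and impedance_change_pct >= on_pct:
        return True     # biofilm detected -> apply low electric field + analog
    if in_treatment and impedance_change_pct <= off_pct:
        return False    # biomass cleared -> stop treatment
    return in_treatment

state = False
for reading in [2.0, 8.0, 16.0, 12.0, 4.0]:   # % impedance change over time
    state = control_step(reading, state)
    print(f"impedance change {reading:5.1f}% -> treatment {'ON' if state else 'OFF'}")
```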