964 results for Graders (Earthmoving machinery)


Relevance:

10.00%

Publisher:

Abstract:

Research on naïve physics investigates children's intuitive understanding of physical objects, phenomena and processes. Children, and also many adults, have been found to hold a misconception of inertia known as the impetus theory. To investigate the development of this naïve concept and the mechanism underlying it, four age groups (5-year-olds, 2nd graders, 5th graders, and 8th graders) were included in this research. Modified experimental tasks were used to explore the effects of daily experience, perceptual cues and general information-processing ability on children's understanding of inertia. The results of this research are: 1) Five- to thirteen-year-olds' understanding of inertia problems involving two objects moving at the same speed follows an L-shaped developmental trend: children's performance became worse as they got older, and their performance in the experiment did not necessarily improve with their cognitive abilities. 2) The L-shaped developmental curve suggests that children of different ages used different strategies to solve inertia problems: five- to eight-year-olds used only a heuristic strategy, while eleven- to thirteen-year-olds solved problems by analyzing the details of inertial motion. 3) The difference in performance between familiar and unfamiliar problems showed that older children were not able to spontaneously transfer their knowledge and experience from everyday action and observation of inertia to unfamiliar, abstract inertia problems. 4) Five- to eight-year-olds showed straight and fragmented response patterns, while more eleven- to thirteen-year-olds showed standard impetus theory and revised impetus theory patterns, indicating that younger children were influenced by perceptual cues and held a fragmented understanding of inertia, while older children held a coherent impetus theory. 5) When perceptual cues were controlled, as many as 40 percent of five-year-olds showed the information-processing ability to analyze the distance, speed and time of two objects traveling in different directions at the same time, demonstrating that they had reached the level necessary to theorize their naïve concept of inertia.

Relevance:

10.00%

Publisher:

Abstract:

As previous research has shown, the error rate on consistent compare word problems (e.g., "Mary has 5 apples. Tom has 2 apples more than Mary. How many apples does Tom have?") is much lower than the error rate on inconsistent compare word problems (e.g., "Mary has 5 apples. She has 2 apples more than Tom. How many apples does Tom have?"). This difference in error rates is known as the consistency effect. There are several explanations of why the consistency effect happens; one of them is R. E. Mayer's account of two kinds of problem-solving strategies. In Mayer's view, unsuccessful problem solvers make mistakes on inconsistent problems because they use the Direct Translation Strategy, while problem solvers who use the Problem Model Strategy do not make such mistakes. In this study, three experiments with 3rd graders investigated the reasons for the consistency effect. The results of Experiment 1 do not support Mayer's two-strategy explanation. Experiment 2 shows no relation between errors on inconsistent problems and an impulsive cognitive style. Experiment 3 reveals that the working memory capacity of successful inconsistent-problem solvers is significantly larger than that of unsuccessful solvers. Working memory is therefore suggested to be an important factor contributing to the consistency effect.
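To make Mayer's distinction concrete, here is a minimal sketch of how each strategy resolves the relational statement in a compare problem. This is an illustration of the idea only, not code from the study; the function names and the keyword encoding are my own assumptions.

# Inconsistent problem: "Mary has 5 apples, she has 2 apples MORE
# than Tom. How many apples does Tom have?"

def direct_translation(known_amount, difference, keyword):
    """Map the relational keyword straight to an operation:
    'more' -> add, 'less' -> subtract, regardless of who holds the relation."""
    return known_amount + difference if keyword == "more" else known_amount - difference

def problem_model(known_amount, difference, keyword, known_is_subject):
    """Build a model of the situation first: if the KNOWN quantity is the
    subject of 'more than', the unknown must be smaller, so the operation flips."""
    add = (keyword == "more") != known_is_subject
    return known_amount + difference if add else known_amount - difference

print(direct_translation(5, 2, "more"))                    # 7, the typical error
print(problem_model(5, 2, "more", known_is_subject=True))  # 3, correct

On the consistent version ("Tom has 2 apples more than Mary"), known_is_subject is False and both strategies return 7, which matches the lower error rate reported for consistent problems.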

Relevance:

10.00%

Publisher:

Abstract:

Expert systems are too slow. This work attacks that problem by speeding up a useful system component that remembers facts and tracks down simple consequences. The redesigned component can assimilate new facts more quickly because it uses a compact, grammar-based internal representation to deal with whole classes of equivalent expressions at once. It can support faster hypothetical reasoning because it remembers the consequences of several assumption sets at once. The new design is targeted for situations in which many of the stored facts are equalities. The deductive machinery considered here supplements stored premises with simple new conclusions. The stored premises include permanently asserted facts and temporarily adopted assumptions. The new conclusions are derived by substituting equals for equals and using the properties of the logical connectives AND, OR, and NOT. The deductive system provides supporting premises for its derived conclusions. Reasoning that involves quantifiers is beyond the scope of its limited and automatic operation. The expert system of which the reasoning system is a component is expected to be responsible for overall control of reasoning.
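The key to handling "whole classes of equivalent expressions at once" is to store equalities as equivalence classes rather than as individual facts. The following is a deliberately tiny sketch under my own assumptions, a plain union-find store, which is far simpler than the grammar-based representation described above:

# Equivalence classes of terms via union-find: asserting an equality
# merges two whole classes, so "substituting equals for equals" becomes
# a constant-time class lookup instead of per-expression rewriting.

class EqualityStore:
    def __init__(self):
        self.parent = {}

    def find(self, term):
        self.parent.setdefault(term, term)
        while self.parent[term] != term:
            self.parent[term] = self.parent[self.parent[term]]  # path halving
            term = self.parent[term]
        return term

    def assert_equal(self, a, b):
        self.parent[self.find(a)] = self.find(b)  # merge the two classes

    def equal(self, a, b):
        return self.find(a) == self.find(b)

db = EqualityStore()
db.assert_equal("x", "y")
db.assert_equal("y", "z")
print(db.equal("x", "z"))  # True: a simple consequence tracked automatically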

Relevance:

10.00%

Publisher:

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance:

10.00%

Publisher:

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance:

10.00%

Publisher:

Abstract:

The purpose of the project is to research the shape and influence of religion and spirituality in the lives of U.S. adolescents; to identify effective practices in the religious, moral, and social formation of the lives of youth; to describe the extent to which youth participate in and benefit from the programs and opportunities that religious communities are offering to their youth; and to foster an informed national discussion about the influence of religion in youth's lives, in order to encourage sustained reflection about and rethinking of our cultural and institutional practices with regard to youth and religion.

Relevance:

10.00%

Publisher:

Abstract:

The purpose of the project is to research the shape and influence of religion and spirituality in the lives of U.S. adolescents; to identify effective practices in the religious, moral, and social formation of the lives of youth; to describe the extent to which youth participate in and benefit from the programs and opportunities that religious communities are offering to their youth; and to foster an informed national discussion about the influence of religion in youth's lives, in order to encourage sustained reflection about and rethinking of our cultural and institutional practices with regard to youth and religion.

Relevance:

10.00%

Publisher:

Abstract:

Background: To date, there is limited research examining sleep patterns in elementary school children. Previous researchers focused on parental responses rather than student responses to determine the factors that affect sleep. The present study surveyed sleep patterns and examined external factors affecting total sleep time among elementary school children and adolescents. Methods: Students in grades 2-5 (n = 885) and grade 10 (n = 190) enrolled in a public school system in the Northeast completed a district-administered survey that included questions on sleep duration and hygiene. Results: Average reported sleep duration decreased with increasing grade level. A much larger proportion of children in grades 2-5 (31.7-72.4%) than of adolescents in grade 10 (6.8%) reported waking up early on their own. Significantly shorter sleep durations were associated with having a television (grades 2, 4, 5; p < 0.01) or a cell phone (grades 3, 4; p < 0.05) in the room, and with playing computer or video games before going to bed (grades 3, 4; p < 0.001). In contrast, students in grades 2, 3, and 4 who reported reading a book before going to bed slept on average 21 minutes more per night (p = 0.029, 0.007, 0.009, respectively). For tenth graders, only consumption of energy drinks was associated with a significant reduction in sleep duration (p < 0.0001). Conclusion: Sleep is fundamental to maintaining a healthy lifestyle. Understanding sleep patterns will assist parents, health care providers, and educators in promoting good sleep hygiene in school-aged children.

Relevance:

10.00%

Publisher:

Abstract:

NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail to enable future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed Domain-Specific Language (DSL) for specifying network configurations at various levels of sketchiness, along with invariants that need to be enforced thereupon. In this paper, we overview NetSketch, highlight its salient features, and illustrate how it could be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity.

Relevance:

10.00%

Publisher:

Abstract:

NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications).
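As a rough intuition for what a strongly-typed DSL for constrained-flow networks buys you, here is a deliberately tiny Python sketch: each component advertises the flow interval it accepts and the interval it may emit, and composition type-checks by interval containment. The interval types, component model, and composition rule are my own simplifications for illustration, not NetSketch's actual typing discipline.

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def contains(self, other: "Interval") -> bool:
        return self.lo <= other.lo and other.hi <= self.hi

@dataclass(frozen=True)
class Component:
    name: str
    accepts: Interval   # flow rates the component can safely take as input
    emits: Interval     # flow rates it may produce on its output

def compose(a: Component, b: Component) -> Component:
    """Connect a's output to b's input; reject the design if b cannot
    absorb everything a might emit (the safety check, done by typing)."""
    if not b.accepts.contains(a.emits):
        raise TypeError(f"{b.name} cannot safely accept the output of {a.name}")
    return Component(f"{a.name}>>{b.name}", a.accepts, b.emits)

shaper = Component("shaper", Interval(0, 100), Interval(0, 10))
link = Component("link", Interval(0, 20), Interval(0, 20))
print(compose(shaper, link).name)  # "shaper>>link": the composition type-checks

The point of the compositional style is visible even at this scale: the unsafe pairing is rejected locally, from the two interface types alone, without re-analyzing the whole system.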

Relevance:

10.00%

Publisher:

Abstract:

This study considered the optimisation of granola breakfast cereal manufacturing processes by wet granulation and pneumatic conveying. Granola is an aggregated food product used as a breakfast cereal and in cereal bars. Processing of granola involves mixing the dry ingredients (typically oats, nuts, etc.) followed by the addition of a binder, which can contain honey, water and/or oil. In this work, two parallel wet granulation processes for producing aggregate granola products were designed and operated: a) a high shear mixing granulation process followed by drying/toasting in an oven, and b) a continuous fluidised bed process followed by drying/toasting in an oven.

In high shear granulation, the influence of process parameters on key granule aggregate quality attributes, such as granule size distribution and the textural properties of granola, was investigated. The experimental results show that the impeller rotational speed is the single most important process parameter influencing granola's physical and textural properties; binder addition rate and wet massing time also have significant effects on granule properties. Increasing the impeller speed and wet massing time increases the median granule size and correlates positively with density. The combination of high impeller speed and low binder addition rate resulted in granules with the highest levels of hardness and crispness. In the fluidised bed granulation process, the effects of nozzle air pressure and binder spray rate on key aggregate quality attributes were studied. The experimental results show that a decrease in nozzle air pressure leads to a larger mean granule size, and that the combination of the lowest nozzle air pressure and the lowest binder spray rate results in granules with the highest levels of hardness and crispness. Overall, the high shear granulation process led to larger, denser, less porous and stronger (less likely to break) aggregates than the fluidised bed process.

The study also examined the particle breakage of granola during pneumatic conveying for products of both granulation processes. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three conveying rig configurations were employed: a straight pipe, a rig with two 45° bends, and one with a 90° bend. Particle breakage increases with applied pressure drop, and the 90° bend pipe results in more attrition at all conveying velocities than the other pipe geometries. Additionally, among the granules produced in the high shear granulator, those produced at the highest impeller speed, while being the largest, have the lowest levels of proportional breakage, while the smaller granules produced at the lowest impeller speed have the highest levels of breakage. This effect clearly shows the importance of shear history (during granule production) on breakage during subsequent processing. For fluidised bed granulation, no single operating parameter was found to have a significant effect on breakage during subsequent conveying.

Finally, a simple power-law breakage model based on process input parameters was developed for both manufacturing processes. It was found suitable for predicting the breakage of granola breakfast cereal at various applied air velocities using a number of pipe configurations, taking shear histories into account.
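To show the shape of the kind of model the last paragraph mentions, here is a hedged sketch of a power-law breakage relation. The functional form, parameter names, and all numerical values are assumptions for illustration only; the thesis fits its own coefficients to the process input parameters.

def breakage_fraction(air_velocity, k=1e-4, n=2.0):
    """Proportional breakage as a power law of conveying air velocity (m/s).
    k and n are fitted constants; the values here are placeholders."""
    return k * air_velocity ** n

# Predicted breakage rises steeply with conveying velocity under this form.
for v in (10.0, 15.0, 20.0):
    print(f"v = {v:4.1f} m/s -> predicted breakage = {breakage_fraction(v):.3f}")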

Relevance:

10.00%

Publisher:

Abstract:

RNA editing is a biological phenomenon that alters nascent RNA transcripts by insertion, deletion and/or substitution of one or a few nucleotides. It is ubiquitous in all kingdoms of life and in viruses. The predominant editing event in organisms with a developed central nervous system is adenosine-to-inosine deamination. Inosine is recognized as guanosine by the translational machinery and by reverse transcriptase. In primates, RNA editing occurs frequently in transcripts from repetitive regions of the genome. In humans, more than 500,000 editing instances have been identified by applying computational pipelines to available ESTs and high-throughput sequencing data, and by using chemical methods. However, the functions of only a small number of cases have been studied thoroughly. RNA editing instances have been found to play roles in the synthesis of peptide variants through non-synonymous codon substitutions, in transcript variants through alterations of splicing sites, and in gene silencing through modifications of miRNA sequences. We established the Database of RNA EDiting (DARNED) to accommodate the reference genomic coordinates of substitution editing in human, mouse and fly transcripts from the published literature, with additional information on edited genomic coordinates collected from various databases, e.g. UCSC and NCBI. DARNED contains mostly adenosine-to-inosine editing and allows searches based on genomic region, gene ID, and user-provided sequence. The database is accessible at http://darned.ucc.ie. RNA editing instances in coding regions are likely to result in recoding during protein synthesis. This encouraged me to focus my research on occurrences of RNA editing specific to CDS and non-Alu exonic regions. By applying various filters to discrepancies between available ESTs and their corresponding reference genomic sequences, putative RNA editing candidates were identified. High-throughput sequencing was used to validate these candidates; all predicted coordinates appeared to be either SNPs or unedited.
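The core signal behind such pipelines follows directly from the biochemistry stated above: inosine is read as guanosine by reverse transcriptase, so an A-to-I edited site shows up as an A in the reference genome but a G in the aligned EST/cDNA read. Here is an illustrative sketch of that comparison; it is my own toy example, not the DARNED pipeline, and it assumes a gap-free alignment.

def candidate_editing_sites(reference: str, est: str, offset: int = 0):
    """Yield coordinates where the reference has A but the EST has G,
    the telltale mismatch of A-to-I editing in gap-free aligned sequences."""
    for i, (ref_base, est_base) in enumerate(zip(reference, est)):
        if ref_base == "A" and est_base == "G":
            yield offset + i

reference = "TTACAGGATA"
est       = "TTGCAGGGTA"  # G against reference A at positions 2 and 7
print(list(candidate_editing_sites(reference, est)))  # [2, 7]

In practice the candidates must then survive exactly the filters the abstract describes, since SNPs and sequencing errors produce the same mismatch signature.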

Relevance:

10.00%

Publisher:

Abstract:

Vascular smooth muscle cells (VSMC) are one of the key players in the pathogenesis of cardiovascular diseases. The origin of neointimal VSMC has thus become a prime focus of research. VSMC originate from multiple progenitor cell types. In the embryo, the well-defined sources of VSMC include neural crest cells, proepicardial cells and EPC. In adults, though progenitor cells from bone marrow (BM), circulation and tissues giving rise to SMC have been identified, no progress has been made in isolating a highly proliferative clonal population of adult stem cells with the potential to differentiate into SMC. Smooth muscle-like stem progenitor cells (SMSPC) were isolated from the cardiopulmonary bypass filters of adult patients undergoing CABG. Rat SMSPC had previously been isolated by our group from the bone marrow of Fischer rats and also from the peripheral blood of the monocrotaline-induced pulmonary hypertension (MCT-PHTN) animal model. Characterization of these novel SMSPC revealed stem cell characteristics and the machinery for differentiation into SMC. The expression of Isl-1 on SMSPC provided a unique molecular identity for these circulating stem progenitor cells. The functional potential of SMSPC was determined by monitoring the adoptive transfer of GFP+ SMSPC in rodent models of vascular injury: carotid injury and MCT-PHTN. The participation of SMSPC in vascular pathology was confirmed by quantifying peripheral-blood and engrafted levels of SMSPC using RT-PCR. In terms of translation into clinical practice, SMSPC could be a good tool for detecting atherosclerotic plaque burden. The current study demonstrates the existence of novel adult stem progenitor cells in the circulation, with a potential role in vascular pathology.

Relevance:

10.00%

Publisher:

Abstract:

In this work we introduce a new mathematical tool for the optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network, and routing is performed in the direction of this vector field at every location of the network. The magnitude of the vector field at every location represents the density of the data being transited through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area.

With the above formulation, we introduce mathematical machinery based on partial differential equations very similar to Maxwell's equations in electrostatic theory. We show that in order to minimize the cost, the routes should be found based on the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to the positive charges in electrostatics; the destinations are sinks of information, similar to negative charges; and the network is similar to a non-homogeneous dielectric medium with a variable dielectric constant (or permittivity coefficient).

In one application of our vector field model, we offer a scheme for energy-efficient routing. Our routing scheme is based on setting the permittivity coefficient to a higher value in the places of the network where nodes have high residual energy, and to a low value where the nodes do not have much energy left. Our simulations show that our method gives a significant increase in network lifetime compared to the shortest path and weighted shortest path schemes.

Our initial focus is on the case where there is only one destination in the network; later we extend our approach to the case of multiple destinations. With multiple destinations, we need to partition the network into several areas known as the regions of attraction of the destinations. Each destination is responsible for collecting all messages generated in its region of attraction. The complexity of the optimization problem in this case lies in how to define the regions of attraction and how much communication load to assign to each destination to optimize the performance of the network. We use our vector field model to solve the optimization problem for this case. We define a vector field which is conservative, and hence can be written as the gradient of a scalar field (also known as a potential field). We then show that, in the optimal assignment of the communication load to the destinations, the value of that potential field should be equal at the locations of all the destinations.

Another application of our vector field model is finding the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations. Based on this fact, we suggest an algorithm to be applied during the design phase of a network to relocate the destinations so as to reduce the communication cost. The performance of our proposed schemes is confirmed by several examples and simulation experiments.
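In symbols, the electrostatics analogy described above can be written as follows. The notation is my own choice for illustration rather than a quotation from the thesis: D is the data-flow density field, rho the information source density (positive at sensors, negative at destinations), and epsilon the permittivity-like coefficient over the network area A.

\min_{\mathbf{D}} \; J \;=\; \int_{A} \frac{\lVert \mathbf{D}(x) \rVert^{2}}{\epsilon(x)} \, dA
\qquad \text{subject to} \qquad \nabla \cdot \mathbf{D} = \rho .

At the optimum, exactly as in electrostatics, the field is conservative and derives from a scalar potential \phi, giving the Maxwell-like equations

\mathbf{D} = -\,\epsilon \, \nabla \phi , \qquad \nabla \cdot \bigl( \epsilon \, \nabla \phi \bigr) = -\,\rho ,

which makes the energy-aware routing scheme transparent: raising \epsilon where nodes have spare energy lowers the local cost of carrying flow, so the optimal field bends traffic through those regions.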
In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates and define it as the degree to which a TCP aggregate reduces its sending rate to the network in response to packet drops. We define metrics that describe the responsiveness of TCP aggregates and suggest two methods for determining their values. The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make the test robust to multiple simultaneous tests by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce tests of responsiveness for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control; a distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates.

In the next step, we modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it and observing its response. We offer two methods for conformance testing. In the first method, we apply the perturbation tests to SYN packets sent at the start of the TCP 3-way handshake, using the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second method, we apply the perturbation tests to TCP data packets, using the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods, we use signature-based perturbations, meaning that packet drops are performed at a rate given by a function of time. We use the analogy of our problem with multiple-access communication to find signatures: specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods. As a result of orthogonality, performance does not degrade due to cross-interference from simultaneously testing routers. We have shown the efficacy of our methods through mathematical analysis and extensive simulation experiments.
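As a rough illustration of why orthogonal signatures make simultaneous tests non-interfering, here is a small Python sketch. The Walsh-code construction, the linear response model, and all numbers are my own assumptions for exposition, not the thesis's implementation.

# Routers perturb a shared aggregate with mutually orthogonal +/-1 drop
# signatures; each router recovers the aggregate's response to its own
# test by correlation, undisturbed by the other routers' concurrent tests.

import numpy as np

def walsh_codes(order):
    """Rows of a 2^order Hadamard matrix: mutually orthogonal signatures."""
    h = np.array([[1.0]])
    for _ in range(order):
        h = np.block([[h, h], [h, -h]])
    return h

codes = walsh_codes(3)                       # eight signatures of length 8
responsiveness = np.array([0.9, 0.2, 0.5])   # per-router ground truth (assumed)
signatures = codes[1:4]                      # skip the all-ones row; one per router

# The observed rate change is the superposition of the aggregate's responses
# to all three simultaneous perturbation tests, plus measurement noise.
rng = np.random.default_rng(0)
observed = responsiveness @ signatures + rng.normal(0, 0.05, signatures.shape[1])

# Correlating with each signature isolates that router's own test result.
recovered = signatures @ observed / signatures.shape[1]
print(np.round(recovered, 2))                # close to [0.9, 0.2, 0.5]

A low recovered value under this toy model would correspond to an unresponsive (possibly non-conformant) share of the aggregate, which is the quantity the conformance tests estimate.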

Relevance:

10.00%

Publisher:

Abstract:

Autophagy has been predominantly studied as a nonselective self-digestion process that recycles macromolecules and produces energy in response to starvation. However, autophagy independent of nutrient status has long been known to exist. Recent evidence suggests that this form of autophagy enforces intracellular quality control by selectively disposing of aberrant protein aggregates and damaged organelles--common denominators in various forms of neurodegenerative diseases. By definition, this form of autophagy, termed quality-control (QC) autophagy, must be different from nutrient-regulated autophagy in substrate selectivity, regulation and function. We have recently identified the ubiquitin-binding deacetylase, HDAC6, as a key component that establishes QC. HDAC6 is not required for autophagy activation per se; rather, it is recruited to ubiquitinated autophagic substrates where it stimulates autophagosome-lysosome fusion by promoting F-actin remodeling in a cortactin-dependent manner. Remarkably, HDAC6 and cortactin are dispensable for starvation-induced autophagy. These findings reveal that autophagosomes associated with QC are molecularly and biochemically distinct from those associated with starvation autophagy, thereby providing a new molecular framework to understand the emerging complexity of autophagy and therapeutic potential of this unique machinery.