891 results for on-line analysis of milk
Abstract:
Purpose: This study investigated the influence of long-term wearing of unstable shoes (WUS) on compensatory postural adjustments (CPA) to an external perturbation. Methods: Participants were divided into two groups: one wore unstable shoes and the other wore conventional shoes for 8 weeks. The ground reaction force signal was used to calculate the anterior–posterior (AP) displacement of the centre of pressure (CoP), and the electromyographic signals of the gastrocnemius medialis (GM), tibialis anterior (TA), rectus femoris (RF) and biceps femoris (BF) muscles were used to assess individual muscle activity, antagonist co-activation and reciprocal activation at the joint level (TA/GM and RF/(BF + GM) pairs) and at the muscle-group level (ventral (TA + RF)/dorsal (GM + BF) pair) within time intervals typical for CPA. The electromyographic signal was also used to assess muscle latency. These variables were evaluated before and after the 8-week period, both barefoot and while wearing the unstable shoes. Results: Long-term WUS led to: an increase of BF activity in both conditions (barefoot and wearing the unstable shoes); a decrease of GM activity; and an increase of antagonist co-activation and a decrease of reciprocal activation at the TA/GM and ventral/dorsal pairs in the unstable-shoe condition. Additionally, WUS led to a decrease in CoP displacement. However, no differences were observed in muscle onset and offset times. Conclusion: The results suggest that prolonged use of unstable shoes increases antagonist co-activation at the ankle and muscle-group levels and improves the performance of the postural control system.
Abstract:
Background: Diet and physical activity (PA) are recognized as important factors in preventing abdominal obesity (AO), which is strongly associated with chronic diseases. Some studies have reported an inverse association between milk consumption and AO. Objective: This study examined the association between milk intake, PA and AO in adolescents. Methods: A cross-sectional study was conducted with 1209 adolescents, aged 15–18, from the Azorean Archipelago, Portugal, in 2008. AO was defined as a waist circumference at or above the 90th percentile. Adolescent food intake was measured using a semi-quantitative food frequency questionnaire, and milk intake was categorized as 'low milk intake' (<2 servings per day) or 'high milk intake' (≥2 servings per day). PA was assessed via a self-report questionnaire, and participants were divided into active (>10 points) and low-active (≤10 points) groups on the basis of their reported PA. They were then divided into four smaller groups according to milk intake and PA: (i) low milk intake/low active; (ii) low milk intake/active; (iii) high milk intake/low active and (iv) high milk intake/active. The association between milk intake, PA and AO was evaluated using logistic regression analysis, and the results were adjusted for demographic, body mass index, pubertal stage and dietary confounders. Results: The majority of adolescents consumed semi-skimmed or skimmed milk (92.3%). The high milk intake/active group had a lower proportion of AO than the other groups (low milk intake/low active: 34.2%; low milk intake/active: 26.9%; high milk intake/low active: 25.7%; high milk intake/active: 21.9%; P = 0.008). After adjusting for confounders, both low-active and active adolescents with high milk intake were less likely to have AO than low-active adolescents with low milk intake (high milk intake/low active: odds ratio [OR] = 0.412, 95% confidence interval [CI]: 0.201–0.845; high milk intake/active: OR = 0.445, 95% CI: 0.235–0.845). Conclusion: High milk intake seems to have a protective effect against AO, regardless of PA level.
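As an aside, adjusted odds ratios and confidence intervals of the kind reported above are typically obtained by exponentiating logistic-regression coefficients. A minimal sketch with synthetic data (group names and sample sizes are illustrative; the study's actual model also adjusted for the confounders listed above):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Synthetic stand-in for the four exposure groups; the reference category
# (low milk intake/low active) is dropped, so each OR compares against it.
group = rng.choice(["lowmilk_lowactive", "lowmilk_active",
                    "highmilk_lowactive", "highmilk_active"], size=500)
ao = rng.binomial(1, 0.3, size=500)  # 1 = abdominal obesity (synthetic)

dummies = pd.get_dummies(pd.Series(group)).drop(columns="lowmilk_lowactive")
X = sm.add_constant(dummies.astype(float))
fit = sm.Logit(ao, X).fit(disp=0)

odds_ratios = np.exp(fit.params)   # exp(beta) = odds ratio vs. reference
ci = np.exp(fit.conf_int())        # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios, ci], axis=1))
```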
Abstract:
Modern real-time systems, with their more flexible and adaptive nature, demand approaches to timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation emerges as an adequate solution for understanding and analyzing the timing behaviour of actual systems. However, care must be taken with the obtained outputs, at the risk of producing results that lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence that can be attached to their mean values. This is the basis for a discussion of the applicability of such approaches to deriving confidence on the tail of a distribution, where the worst case is expected to lie.
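To make the tail-confidence idea concrete, one standard option (an illustrative sketch, not necessarily the approach the abstract evaluates) is to treat each deadline exceedance as a Bernoulli event and place an exact binomial confidence interval around the estimated tail probability:

```python
import numpy as np
from scipy.stats import beta

def tail_probability_ci(samples, threshold, conf=0.95):
    """Estimate P(X > threshold) from simulation output with a
    Clopper-Pearson (exact binomial) confidence interval, which stays
    meaningful even when only a handful of tail events are observed."""
    n = len(samples)
    k = int(np.sum(np.asarray(samples) > threshold))  # observed exceedances
    a = 1.0 - conf
    lo = beta.ppf(a / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - a / 2, k + 1, n - k) if k < n else 1.0
    return k / n, (lo, hi)

# Example: 10000 simulated response times, checked against a deadline of 0.9.
rng = np.random.default_rng(1)
times = rng.lognormal(mean=-1.0, sigma=0.5, size=10_000)
print(tail_probability_ci(times, threshold=0.9))
```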
Abstract:
The use of multicores is becoming widespread in the field of embedded systems, many of which have real-time requirements. Hence, ensuring that real-time applications meet their timing constraints is a prerequisite before deploying them on these systems. This necessitates considering the impact of contention for shared low-level hardware resources, such as the front-side bus (FSB), on the Worst-Case Execution Time (WCET) of the tasks. Towards this aim, this paper proposes a method to determine an upper bound on the number of bus requests that tasks executing on a core can generate in a given time interval. We show that our method yields tighter upper bounds than the state of the art. We then apply our method to compute the extra contention delay incurred by tasks when they are co-scheduled on different cores and access the shared main memory over a shared bus, to which access is granted using a round-robin (RR) arbitration protocol.
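For intuition, a bound of this kind can be written down directly from a sporadic task model (a deliberately coarse sketch with assumed parameters; the bound proposed in the paper is tighter than this):

```python
import math

def bus_request_bound(window, tasks):
    """Coarse upper bound on bus requests generated by the tasks on one
    core in any window of length `window`. Each task is (T, R): minimum
    inter-arrival time T and at most R bus requests per job. At most
    ceil(window/T) + 1 jobs can overlap the window (the +1 accounts for
    a carry-in job released before the window starts)."""
    return sum((math.ceil(window / T) + 1) * R for T, R in tasks)

# Example: two tasks issuing at most 20 and 35 requests per job.
print(bus_request_bound(100, [(40, 20), (70, 35)]))
```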
Abstract:
The current industry trend is towards using Commercial Off-The-Shelf (COTS) multicores for developing real-time embedded systems, as opposed to custom-made hardware. In typical implementations of such COTS-based multicores, multiple cores access the main memory via a shared bus. This often leads to contention on this shared channel, which increases the response time of the tasks. Analyzing this increased response time, considering the contention on the shared bus, is challenging on COTS-based systems, mainly because bus arbitration protocols are often undocumented and the exact instants at which the shared bus is accessed by tasks are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. This paper makes three contributions towards analyzing tasks scheduled on COTS-based multicores. Firstly, we describe a method to model the memory access patterns of a task. Secondly, we apply this model to analyze the worst-case response time for a set of tasks. Thirdly, although the parameters required to obtain the request profile can be derived by static analysis, we provide an alternative method to obtain them experimentally using performance monitoring counters (PMCs). We also compare our work against an existing approach and show that ours outperforms it by providing a tighter upper bound on the number of bus requests generated by a task.
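As background, response-time analyses of this kind typically extend the classical fixed-point iteration with a contention term; a minimal sketch follows (the contention model here is a caller-supplied stand-in, not the paper's actual memory-access model):

```python
import math

def wcrt(task, hps, bus_delay, deadline):
    """Classical response-time iteration extended with a bus-contention
    term: R = C + sum over higher-priority tasks of ceil(R/T_j) * C_j,
    plus bus_delay(R), the worst-case memory stall in a window of
    length R caused by traffic from other cores."""
    C, _T = task
    R = C
    while R <= deadline:
        nxt = C + sum(math.ceil(R / Tj) * Cj for Cj, Tj in hps) + bus_delay(R)
        if nxt == R:
            return R          # fixed point reached: R bounds the WCRT
        R = nxt
    return None               # no fixed point within the deadline

# Illustrative linear stall model: 1 time unit of stall per 10 of window.
print(wcrt((5, 50), [(2, 10), (3, 25)], lambda w: math.ceil(w / 10), 50))
```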
Abstract:
Consider the problem of scheduling a set of sporadically arriving tasks on a uniform multiprocessor with the goal of meeting deadlines. A processor p has speed Sp. Tasks can be preempted, but they cannot migrate between processors. We propose an algorithm that can schedule all task sets that any other possible algorithm can schedule, assuming that our algorithm is given processors that are three times faster.
Abstract:
Consider the problem of scheduling a set of sporadically arriving tasks on a uniform multiprocessor with the goal of meeting deadlines. A processor p has speed Sp. Tasks can be preempted, but they cannot migrate between processors. On each processor, tasks are scheduled according to rate-monotonic priorities. We propose an algorithm that can schedule all task sets that any other possible algorithm can schedule, assuming that our algorithm is given processors that are √2/(√2−1) ≈ 3.41 times faster. No such guarantee was previously known for partitioned static-priority scheduling on uniform multiprocessors.
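For reference, the speedup factor simplifies by rationalising the denominator (a routine check, not part of the abstract):

\[
\frac{\sqrt{2}}{\sqrt{2}-1}
  = \frac{\sqrt{2}\,(\sqrt{2}+1)}{(\sqrt{2}-1)(\sqrt{2}+1)}
  = \frac{2+\sqrt{2}}{2-1}
  = 2+\sqrt{2}
  \approx 3.414
\]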
Abstract:
Consider a multihop network comprising Ethernet switches. The traffic is described as flows, each characterized by its source node, its destination node, its route, and parameters in the generalized multiframe model. Output queues on the Ethernet switches are scheduled by static-priority scheduling, and tasks executing on the processor in an Ethernet switch are scheduled by stride scheduling. We present schedulability analysis for this setting.
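For readers unfamiliar with stride scheduling, the policy named above for the switch processors, here is a minimal sketch (client names and ticket counts are illustrative):

```python
import heapq

def stride_schedule(clients, quanta):
    """Simulate stride scheduling: each client holds some tickets, and
    clients with more tickets run proportionally more often. The client
    with the lowest pass value runs next; its pass then advances by its
    stride (a large constant divided by its ticket count)."""
    STRIDE1 = 1 << 20  # large constant used to derive per-client strides
    heap = [(STRIDE1 // t, name, STRIDE1 // t) for name, t in clients.items()]
    heapq.heapify(heap)
    schedule = []
    for _ in range(quanta):
        pass_val, name, stride = heapq.heappop(heap)  # lowest pass runs
        schedule.append(name)
        heapq.heappush(heap, (pass_val + stride, name, stride))
    return schedule

# 'A' holds twice the tickets of 'B', so it receives twice the quanta.
print(stride_schedule({"A": 2, "B": 1}, quanta=6))
```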
Abstract:
Introduction: the multimodality environment brings a requirement for greater understanding of the imaging technologies used, of the limitations of these technologies, and of how best to interpret the results, alongside dose optimization, the introduction of new techniques, and the gap between current practice and best practice. Incidental findings in the low-dose CT images obtained as part of the hybrid imaging process are an increasing phenomenon with advancing CT technology, with resultant ethical and medico-legal dilemmas; understanding the limitations of these procedures is important when reporting images and recommending follow-up. A free-response observer performance study was used to evaluate lesion detection in low-dose CT images obtained during attenuation correction acquisitions for myocardial perfusion imaging, on two hybrid imaging systems.
Abstract:
On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG and IJTAG. The controllability and observability features provided by OCD infrastructures constitute a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by diluting its cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault-tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL, and the results were obtained by simulation within the same fault injection environment. The focus of this paper is on the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
Abstract:
Master's dissertation in Informatics Engineering
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
To boost logic density and reduce per-unit power consumption, SRAM-based FPGA manufacturers have adopted nanometric technologies. However, these technologies are highly vulnerable to radiation-induced faults, which affect values stored in memory cells, and to manufacturing imperfections. Fault-tolerant implementations based on Triple Modular Redundancy (TMR) infrastructures help to maintain the correct operation of the circuit. However, TMR alone is not sufficient to guarantee safe operation. Other issues, such as module placement, the effects of multi-bit upsets (MBUs) and fault accumulation, must also be addressed. When a fault occurs, the correct operation of the affected module must be restored and/or the current state of the circuit coherently re-established. This paper presents a solution that enables the autonomous restoration of the functional definition of the affected module, avoiding fault accumulation and re-establishing the correct circuit state in real time, while keeping the circuit in normal operation.
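To illustrate the TMR principle the abstract builds on, a bitwise majority voter can be sketched as follows (a behavioural model in Python for illustration only; real designs implement the voter in the FPGA fabric itself):

```python
def tmr_vote(a: int, b: int, c: int) -> tuple[int, int]:
    """Bitwise 2-of-3 majority vote over three replica outputs.
    Returns the voted value and a syndrome marking bits where at
    least one replica disagreed (non-zero => trigger repair, e.g.
    partial reconfiguration of the faulty module)."""
    voted = (a & b) | (a & c) | (b & c)
    syndrome = (a ^ b) | (a ^ c)
    return voted, syndrome

# One replica corrupted by a bit flip: the vote still yields 0b1010.
print(tmr_vote(0b1010, 0b1010, 0b0010))  # -> (10, 8)
```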
Abstract:
Glioma is the most frequent form of malignant brain tumor in adults and children. There is a global tendency toward a higher incidence of gliomas in highly developed and industrialized countries. Simultaneously, obesity is reaching epidemic proportions in these same countries. It is widely accepted that obesity may play an important role in the biology of several types of cancer. We developed an in vitro method for understanding the influence of obesity on mouse glioma cells (Gl261). 3T3-L1 mouse pre-adipocytes were induced to maturity, and the conditioned medium was harvested and added to the Gl261 cultures. Two-dimensional electrophoresis was used to analyze the proteome content of Gl261 in the presence of conditioned medium (CGl) and in its absence (NCGl). The differentially expressed spots were collected and analyzed by mass spectrometry (MALDI-TOF-MS). Significant changes in expression pattern were observed in eleven proteins and enzymes. RFC1, KIF5C, ANXA2, N-RAP, RACK1 and citrate synthase were overexpressed in, or only present in, CGl. Conversely, STI1, hnRNPs and phosphoglycerate kinase 1 were significantly underexpressed in CGl. Aldose reductase and carbonic anhydrase were expressed only in NCGl. Our results show that obesity remodels the physiological and metabolic behavior of glioma cancer cells. Moreover, the differentially expressed proteins are implicated in several signaling pathways that control matrix remodeling, proliferation, progression, migration and invasion. In general, our results support the idea that obesity may increase glioma malignancy; however, some interesting paradoxical findings are also reported and discussed.
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils, as can be seen in some recent publications in the context of Multivariate Statistics. These newer methods require additional care that is not always included or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples on regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. As for the great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their utilization. In this work we present some reflections on these methodologies and, in particular, on the main constraints that often arise during the data collection process and on the various ways of linking these different techniques. At the end, illustrations of some particular applications of these statistical methods are also presented.
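As an illustration of the geostatistical care the abstract calls for, the empirical semivariogram that underlies variogram modelling can be sketched as follows (the classical Matheron estimator; the grid, lags and tolerance are illustrative choices):

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    """Matheron's classical estimator: for each lag h, average half the
    squared difference of values over all point pairs whose separation
    distance falls within h +/- tol."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    out = []
    for h in lags:
        pairs = np.triu(np.abs(d - h) <= tol, k=1)  # each pair counted once
        out.append(0.5 * sq[pairs].mean() if pairs.any() else np.nan)
    return np.array(out)

# Example on a small regular grid with a synthetic soil property.
rng = np.random.default_rng(42)
xy = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
z = xy.sum(axis=1) * 0.1 + rng.normal(0, 0.2, len(xy))  # trend + noise
print(empirical_variogram(xy, z, lags=[1, 2, 3, 4], tol=0.5))
```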