942 results for Linear Static Analysis


Relevance: 80.00%

Publisher:

Abstract:

Background. Similar to parent support in the home environment, teacher support at school may positively influence children's fruit and vegetable (FV) consumption. This study assessed the relationship between teacher support for FV consumption and the FV intake of 4th and 5th grade students in low-income elementary schools in central Texas. Methods. A secondary analysis was performed on baseline data collected from 496 parent-child dyads during the Marathon Kids study carried out by the Michael & Susan Dell Center for Healthy Living at the University of Texas School of Public Health. A hierarchical linear regression analysis adjusting for key demographic variables, parent support, and home FV availability was conducted. In addition, separate linear regression models stratified by quartiles of home FV availability were conducted to assess the relationship between teacher support and FV intake by level of home FV availability. Results. Teacher support was not significantly related to students' FV intake (p = .44). However, the interaction of teacher support and home FV availability was positively associated with students' FV consumption (p < .05). For students in the lowest quartile of home FV availability, teacher support accounted for approximately 6% of the FV intake variance (p = .02). For higher levels of FV availability, teacher support and FV intake were not related. Conclusions. For lower-income elementary school-aged children with low FV availability at home, greater teacher support may lead to modest increases in FV consumption.
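The regression design described in the abstract (main effects plus a teacher-support × home-availability interaction, followed by models stratified by availability quartile) can be sketched roughly as follows; the column names and the statsmodels-based workflow are illustrative assumptions, not the study's actual code or data.

```python
# Illustrative sketch only -- column names (fv_intake, teacher_support,
# home_fv_avail, parent_support) are hypothetical, not from the Marathon Kids data.
import pandas as pd
import statsmodels.formula.api as smf

def interaction_and_stratified_models(df: pd.DataFrame):
    # Full model: main effects plus the teacher support x home availability interaction.
    full = smf.ols(
        "fv_intake ~ teacher_support * home_fv_avail + parent_support", data=df
    ).fit()

    # Separate simple models within each quartile of home FV availability.
    df = df.assign(avail_q=pd.qcut(df["home_fv_avail"], 4, labels=False))
    stratified = {
        q: smf.ols("fv_intake ~ teacher_support", data=sub).fit()
        for q, sub in df.groupby("avail_q")
    }
    return full, stratified
```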

Relevance: 80.00%

Publisher:

Abstract:

A population-based ecological study was conducted to identify areas with a high number of new TB and HIV diagnoses in Harris County, Texas from 2009 through 2010 by applying Geographic Information Systems (GIS) to determine, through exploratory mapping, whether distinct spatial patterns exist at the census tract level. As of 2010, Texas had the fourth highest occurrence of new diagnoses of HIV/AIDS and TB.[31] The Texas Department of State Health Services (DSHS) has identified HIV-infected persons as a high-risk population for TB in Harris County.[29] To explore this relationship further, GIS was used to identify spatial trends. The specific aims were to map rates of new TB and HIV diagnoses and to spatially identify hotspots and high-value clusters at the census tract level. The potential association between HIV and TB was analyzed using spatial autocorrelation and linear regression analysis; the spatial statistics used were the ArcGIS 9.3 Hotspot Analysis and Cluster and Outlier Analysis tools, and spatial autocorrelation was assessed with Global Moran's I. Hotspots and clusters of TB and HIV are located within the same spatial areas of Harris County. The areas with high-value clusters and hotspots for each infection are located within the central downtown area of the city of Houston. There is an additional TB hotspot directly north of I-10 and an HIV hotspot northeast of Interstate 610. The Moran's I index of 0.17 (Z score = 3.6 standard deviations, p-value = 0.01) suggests that TB is statistically clustered, with less than a 1% chance that this pattern is due to chance. However, a high number of features had no neighbors, which may invalidate the statistical properties of the test. Linear regression analysis indicated that rates of new HIV diagnoses (β = −0.006, SE = 0.147, p = 0.970) and census tracts (β = 0.000, SE = 0.000, p = 0.866) were not significant predictors of rates of new TB diagnoses. The mapping products indicate that census tracts with overlapping hotspots and high-value clusters of TB and HIV should be a targeted focus for prevention efforts, particularly within central Harris County. While the statistical association was not confirmed, the evidence suggests a relationship between HIV and TB within this two-year period.
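For readers unfamiliar with the spatial statistic mentioned above, a minimal Global Moran's I computation looks like the sketch below; the weights matrix and rates are a toy example, not the Harris County tract data.

```python
# Toy Moran's I sketch -- the adjacency matrix and rates are invented examples.
import numpy as np

def morans_i(x: np.ndarray, w: np.ndarray) -> float:
    """Global Moran's I for values x and an n x n spatial weights matrix w."""
    n = len(x)
    z = x - x.mean()
    s0 = w.sum()                 # sum of all weights
    return (n * (z @ w @ z)) / (s0 * (z @ z))

rates = np.array([3.0, 2.5, 0.4, 0.2])          # e.g. new diagnoses per tract
w = np.array([[0, 1, 0, 0],                      # rook adjacency for 4 tracts in a line
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i(rates, w))                        # positive -> similar rates cluster
```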

Relevance: 80.00%

Publisher:

Abstract:

Background: HIV-associated B cell exhaustion is a notable characteristic of HIV-viremic adults. However, it is not known whether such alterations are present in perinatally HIV-infected children, whose viral dynamics differ from those seen in adults. In the present study we analyze B cell subsets and measure antigen-specific memory B cells (MBC) in a pediatric HIV-infected cohort. Methods: Peripheral blood mononuclear cells (PBMC) of perinatally HIV-infected individuals were characterized by FACS into naïve (CD21hi/CD27−), classical (CD27+), tissue-like (CD21lo/CD27−) and activated (CD27+CD21−) MBC. A memory ELISPOT assay was used to detect antibody-secreting cells. We measured total IgG and antibodies specific for influenza, HBV, mumps, measles, rubella and VZV. Memory was expressed as spot-forming cells (SPC) per million PBMC. The Wilcoxon rank-sum test was used to compare unpaired groups, and linear regression analysis was used to determine predictors of B cell dysfunction. Results: 41 perinatally HIV-infected children were included (51.2% female and 65.9% Black). Median (range) age at study was 8.78 years (4.39-11.57). At the time of testing they had a CD4% of 30.9 (23.2-39.4), a viral load (VL) of 1.95 log10 copies/ml (1.68-3.29) and a cumulative VL of 3.4 log10 copy × days (2.7-4.0). Ninety-two percent of the children had been on cARV for more than 6 months. Overall, HIV+ children had significantly lower numbers of IgG and antigen-specific SPC than controls. In addition, they had a lower proportion of classical MBC (12.9 [8.09-19.85] vs 29.4 [18.7-39.05]; 0.01) but a significantly higher proportion of tissue-like MBC (6.01 [2.79-12.7] vs 0.99 [0.87-1.38]; 0.003) compared with controls. Patients were stratified by VL (<400 and ≥400 copies/ml) to evaluate the effect of VL on B cell status. Patients with a VL ≥400 copies/ml had significantly lower IgG, HBV, measles, rubella and VZV SPC counts than those with a VL <400 copies/ml. There were no significant differences in B cell subpopulations between the groups. A moderate negative correlation was observed between the time of cARV initiation and the frequency of IgG memory B cells, suggesting that early initiation of cARV leads to better functionality of the IgG memory B cells (P = 0.05). A statistically significant positive correlation was observed between the total number of IgG memory cells and the number of antigen-specific memory B cell SPC, suggesting that the progressive recovery of the IgG memory B cell pool goes along with a progressive increase in the number of antigen-specific SPC. Conclusion: A pediatric cohort in overall good status with respect to HIV infection and on ART shows defects in B cell numbers and function (reduced total and antigen-specific MBC, increased tissue-like MBC and reduced classical MBC).
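The unpaired group comparison and simple regression mentioned in the Methods can be sketched as below; the counts and times are fabricated placeholders rather than patient data, and scipy is an assumed tool, not necessarily the one used in the study.

```python
# Sketch only -- arrays are invented example values, not study measurements.
import numpy as np
from scipy.stats import ranksums, linregress

low_vl  = np.array([120, 95, 210, 180, 160])   # SPC per million PBMC, VL < 400 copies/ml
high_vl = np.array([40, 25, 60, 55, 30])       # SPC per million PBMC, VL >= 400 copies/ml

stat, p = ranksums(low_vl, high_vl)            # Wilcoxon rank-sum for unpaired groups
print(f"rank-sum statistic = {stat:.2f}, p = {p:.3f}")

# Simple linear regression, e.g. months on cARV vs. IgG memory B cell frequency.
months_on_carv = np.array([6, 12, 24, 36, 60])
igg_mbc_freq   = np.array([2.0, 2.4, 3.1, 3.0, 3.8])
print(linregress(months_on_carv, igg_mbc_freq).slope)
```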

Relevance: 80.00%

Publisher:

Abstract:

Diatoms are the major marine primary producers on the global scale and, recently, several methods have been developed to retrieve their abundance or dominance from satellite remote sensing data. In this work, we highlight the importance of the Southern Ocean (SO) in developing a global algorithm for diatoms using an Abundance Based Approach (ABA). A large global in situ data set of phytoplankton pigments was compiled, in particular with more samples collected in the SO. We revised the ABA to take account of information on the penetration depth (Zpd) and to improve the relationship between diatoms and total chlorophyll-a (TChla). The results showed that there is a distinct relationship between diatoms and TChla in the SO, and a new global model (ABAZpd) improved the estimation of diatom abundance by 28% in the SO compared with the original ABA model. In addition, we developed a regional model for the SO which further improved the retrieval of diatoms by 17% compared with the global ABAZpd model. As a result, we found that diatoms may be more abundant in the SO than previously thought. Linear trend analysis of diatom abundance using the regional model for the SO showed statistically significant trends, both increasing and decreasing, in diatom abundance over the past eleven years in the region.
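As a rough illustration of what an abundance-based approach involves (not the published ABA/ABAZpd formulation), one can regress the diatom fraction of chlorophyll-a on log10(TChla) in situ and apply the fit to satellite TChla; all coefficients and values below are invented.

```python
# Simplified abundance-based sketch -- data and coefficients are invented.
import numpy as np

tchla_insitu = np.array([0.05, 0.1, 0.3, 0.8, 2.0, 5.0])      # total chl-a, mg m^-3
diatom_frac  = np.array([0.05, 0.10, 0.22, 0.40, 0.62, 0.80])  # pigment-derived fraction

# Linear fit of the diatom fraction against log10(TChl-a).
slope, intercept = np.polyfit(np.log10(tchla_insitu), diatom_frac, 1)

def diatom_chla(tchla_satellite):
    frac = np.clip(intercept + slope * np.log10(tchla_satellite), 0.0, 1.0)
    return frac * tchla_satellite          # diatom chl-a = fraction * total chl-a

print(diatom_chla(np.array([0.2, 1.0, 3.0])))
```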

Relevance: 80.00%

Publisher:

Abstract:

Kelp forests represent a major habitat type in coastal waters worldwide, and their structure and distribution are predicted to change due to global warming. Despite their ecological and economic importance, there is still a lack of reliable spatial information on their abundance and distribution. In recent years, various hydroacoustic mapping techniques for sublittoral environments have evolved. However, in turbid coastal waters, such as off the island of Helgoland (Germany, North Sea), the kelp vegetation is present at shallow water depths normally excluded from hydroacoustic surveys. In this study, single beam survey data consisting of the two seafloor parameters roughness and hardness were obtained with RoxAnn from water depths between 2 and 18 m. Our primary aim was to reliably detect the kelp forest habitat at different densities and distinguish it from other vegetated zones. Five habitat classes were identified using underwater video and were applied for classification of acoustic signatures. Subsequently, spatial prediction maps were produced via two classification approaches: linear discriminant analysis (LDA) and a manual classification routine (MC). LDA was able to distinguish dense kelp forest from other habitats (i.e. mixed seaweed vegetation, sand, and barren bedrock), but not variations in kelp density. In contrast, MC also provided information on the distribution of medium-dense kelp, which is characterized by intermediate roughness and hardness values caused by reduced kelp abundance. The prediction maps reach accordance levels of 62% (LDA) and 68% (MC). The presence of vegetation (kelp and mixed seaweed vegetation) was determined with higher prediction accuracies of 75% (LDA) and 76% (MC). Since the acoustic signatures of the different habitat classes strongly overlap, the manual classification method was more appropriate for separating different kelp forest densities and low-lying vegetation. It became evident that the occurrence of kelp in this area is not simply linked to water depth. Moreover, this study shows that the two seafloor parameters collected with RoxAnn are suitable indicators for discriminating densely vegetated seafloor habitats in shallow environments.
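The LDA step of this workflow can be sketched with scikit-learn as follows; the roughness/hardness values and class labels are invented stand-ins for the RoxAnn measurements and video ground truth.

```python
# Sketch of the LDA classification step -- features and labels are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Columns: roughness, hardness (arbitrary example scale).
X = np.array([[0.8, 0.2], [0.9, 0.3], [0.2, 0.9], [0.3, 0.8], [0.5, 0.5], [0.6, 0.4]])
y = np.array(["dense_kelp", "dense_kelp", "bedrock", "bedrock", "sand", "sand"])

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[0.85, 0.25], [0.25, 0.85]]))   # predicted habitat classes
```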

Relevance: 80.00%

Publisher:

Abstract:

Long chain 1,13- and 1,15-alkyl diols form the basis of a number of recently proposed proxies used for climate reconstruction. However, the sources of these lipids and the environmental controls on their distribution are still poorly constrained. We have analyzed the long chain alkyl diol (LCD) composition of cultures of ten eustigmatophyte species, with three species from different families grown at various temperatures, to identify the effect of species composition and growth temperature on the LCD distribution. The results were compared with the LCD distribution of sixty-two lake surface sediments and with previously reported LCD distributions from marine environments. The different families within the Eustigmatophyceae show distinct LCD patterns, with the freshwater family Eustigmataceae most closely resembling LCD distributions in both marine and lake environments. Unlike in the other two eustigmatophyte families analyzed (Monodopsidaceae and Goniochloridaceae), the C28 and C30 1,13-alkyl diols and the C30 and C32 1,15-alkyl diols are all relatively abundant in the family Eustigmataceae, while the mono-unsaturated C32 1,15-alkyl diol was below the detection limit. In contrast to the marine environment, LCD distributions in lakes did not show a clear relationship with temperature. The Long chain Diol Index (LDI), a proxy previously proposed for sea surface temperature reconstruction, showed a relatively weak correlation (R2 = 0.33) with mean annual air temperature used as an approximation for the annual mean surface temperature of the lakes. A much improved correlation (R2 = 0.74, p-value < 0.001) was obtained by applying a multiple linear regression analysis between LCD distributions and lake temperatures reconstructed from branched tetraether lipid distributions. The resulting regression model provides good temperature estimates for cultures of the family Eustigmataceae, suggesting that algae belonging to this family play an important role as a source of LCDs in lacustrine environments or, alternatively, that the main sources of LCDs are affected by temperature in a similar way to the Eustigmataceae. The results suggest that LCDs may have the potential to be applied as a palaeotemperature proxy for lacustrine environments, although further calibration work is still required.
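The multiple linear regression of lake temperature on LCD fractional abundances can be sketched as an ordinary least-squares fit; the abundances, temperatures and number of predictors below are invented and do not reproduce the published calibration.

```python
# Sketch of a multiple linear regression of temperature on LCD abundances.
# All numbers are invented placeholders.
import numpy as np

lcd = np.array([[0.10, 0.20, 0.55, 0.08],    # rows = sediment samples,
                [0.05, 0.15, 0.68, 0.07],    # columns = fractional abundances of
                [0.22, 0.30, 0.38, 0.06],    # four (hypothetical) long chain diols
                [0.28, 0.33, 0.30, 0.05],
                [0.15, 0.25, 0.48, 0.09],
                [0.12, 0.22, 0.52, 0.08],
                [0.30, 0.35, 0.25, 0.04]])
temp = np.array([12.0, 14.5, 8.0, 6.5, 10.0, 11.0, 5.5])   # brGDGT-based lake T (deg C)

X = np.column_stack([np.ones(len(temp)), lcd])    # intercept + one slope per diol
coefs, *_ = np.linalg.lstsq(X, temp, rcond=None)
pred = X @ coefs
print(coefs)
print(np.corrcoef(pred, temp)[0, 1] ** 2)         # in-sample R^2
```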

Relevance: 80.00%

Publisher:

Abstract:

The recent development of in-situ monitoring devices, such as UV spectrometers, makes the study of short-term stream chemistry variation relevant, especially the study of diurnal cycles, which are not yet fully understood. Our study is based on high-frequency data from an agricultural catchment (Studienlandschaft Schwingbachtal, Germany). We propose a novel approach, the combination of cluster analysis and Linear Discriminant Analysis, to mine nitrate behavior patterns from these data. As a result, we observe a seasonality of nitrate diurnal cycles that differs from the cycle seasonality most commonly described in the literature, i.e. pre-dawn peaks in spring. Our cycles appear in summer, and the maximum and minimum shift to a later time in late summer/autumn. This is observed for both water- and energy-limited years, potentially stressing the role of evapotranspiration. This concluding hypothesis on the role of evapotranspiration in stream nitrate concentration, obtained through data mining, broadens the perspective on the diurnal cycling of stream nitrate concentrations.
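The combination of cluster analysis and linear discriminant analysis can be sketched as below: cluster the daily nitrate cycles by shape, then ask which daily drivers separate the clusters. Everything here is synthetic, and the driver variables are assumptions for illustration only.

```python
# Synthetic sketch of "cluster the diurnal cycles, then discriminate the clusters".
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
hours = np.arange(24)

# Daily nitrate cycles (rows = days, columns = hourly concentrations), synthetic.
cycling = 1.0 + 0.2 * np.sin(2 * np.pi * (hours - 15) / 24) + rng.normal(0, 0.02, (40, 24))
flat    = 1.0 + rng.normal(0, 0.02, (40, 24))
cycles  = np.vstack([cycling, flat])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cycles)

# Daily drivers, e.g. air temperature and evapotranspiration (synthetic).
drivers = np.column_stack([
    np.r_[rng.normal(18, 2, 40), rng.normal(5, 2, 40)],
    np.r_[rng.normal(4, 1, 40), rng.normal(1, 0.5, 40)],
])
lda = LinearDiscriminantAnalysis().fit(drivers, labels)
print(lda.coef_)    # which drivers best separate cycling from non-cycling days
```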

Relevance: 80.00%

Publisher:

Abstract:

Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them and greater reliance will be placed on the delivery of their services. A relevant subset of them are high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices; many of these communication links can transfer data at high speed. The concept of distributed systems emerged to describe systems whose parts are executed on several nodes that interact with each other via a communication network. Java's popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This was the motivation for the development of the RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, the RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification under the Java Community Process (JSR-302) is being developed. Its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither the RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed. The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware that is suitable for the development of distributed hard real-time systems in Java, based on the integration of the RMI (Remote Method Invocation) model and the HRTJ profile. It has been designed and implemented with the main requirements in mind, such as predictability and reliability of the timing behavior and the resource usage. The design starts with the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees a suitable timing behavior. It also includes mechanisms to monitor the functional and timing behavior. It provides independence from the network protocol by defining a network interface and modules. The JRMP protocol was modified to include the two phases, non-functional parameters, and message size optimizations.
Although serialization is one of the fundamental operations for ensuring proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (real consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.
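The separation of resource allocation from execution described above can be illustrated, very loosely and outside Java, by a stub that reserves its resources once and then only executes within them; everything below (class names, the deadline parameter, the thread-pool stand-in) is a hypothetical sketch, not the middleware's API.

```python
# Loose sketch of the two-phase idea: reserve resources first, then execute
# invocations using only what was reserved. Names are hypothetical.
from concurrent.futures import ThreadPoolExecutor
import time

class RemoteStub:
    def __init__(self, deadline_s: float):
        self.deadline_s = deadline_s
        self.worker = None

    def reserve(self):
        # Phase 1: allocate resources (here just a dedicated worker thread)
        # before any invocation, so no allocation happens on the critical path.
        self.worker = ThreadPoolExecutor(max_workers=1)

    def invoke(self, fn, *args):
        # Phase 2: execute using pre-allocated resources and check the deadline.
        start = time.monotonic()
        result = self.worker.submit(fn, *args).result(timeout=self.deadline_s)
        if time.monotonic() - start > self.deadline_s:
            raise TimeoutError("deadline missed")
        return result

stub = RemoteStub(deadline_s=0.1)
stub.reserve()
print(stub.invoke(lambda x: x * x, 7))
```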

Relevance: 80.00%

Publisher:

Abstract:

Several types of parallelism can be exploited in logic programs while preserving correctness and efficiency, i.e. ensuring that the parallel execution obtains the same results as the sequential one and the amount of work performed is not greater. However, such results do not take into account a number of overheads which appear in practice, such as process creation and scheduling, which can induce a slow-down, or, at least, limit speedup, if they are not controlled in some way. This paper describes a methodology whereby the granularity of parallel tasks, i.e. the work available under them, is efficiently estimated and used to limit parallelism so that the effect of such overheads is controlled. The run-time overhead associated with the approach is usually quite small, since as much work is done at compile time as possible. Also, a number of run-time optimizations are proposed. Moreover, a static analysis of the overhead associated with the granularity control process is performed in order to decide whether it is worthwhile. The performance improvements resulting from the incorporation of grain size control are shown to be quite good, especially for systems with medium to large parallel execution overheads.
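The granularity-control idea (run a task in parallel only when a cheap, statically derived cost estimate exceeds the expected overhead) can be sketched outside logic programming as follows; the cost function, threshold and thread-pool machinery are illustrative assumptions.

```python
# Sketch of granularity control: a compile-time derived cost estimate decides at
# run time whether spawning a parallel task is worthwhile. Values are illustrative.
from concurrent.futures import ThreadPoolExecutor

THRESHOLD = 50_000          # below this estimated cost, parallel overhead dominates
_pool = ThreadPoolExecutor()

def estimated_cost(n: int) -> int:
    return n * n            # e.g. derived statically for an O(n^2) predicate

def solve(n):               # the actual work (stand-in for goal execution)
    return sum(i * i for i in range(n))

def run(tasks):
    results = []
    for t in tasks:
        if estimated_cost(t) >= THRESHOLD:
            results.append(_pool.submit(solve, t))    # worth running in parallel
        else:
            results.append(solve(t))                  # run sequentially
    return [r.result() if hasattr(r, "result") else r for r in results]

print(run([10, 500, 20, 800]))
```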

Relevance: 80.00%

Publisher:

Abstract:

This paper describes a stress detection system based on fuzzy logic and two physiological signals: Galvanic Skin Response and Heart Rate. Instead of providing a global stress classification, this approach creates individual stress templates, capturing the behaviour of individuals in situations with different degrees of stress. The proposed method is able to detect stress properly at a rate of 99.5%, evaluated on a database of 80 individuals. This result improves on former approaches in the literature and on well-known machine learning techniques such as SVM, k-NN, GMM and Linear Discriminant Analysis. Finally, the proposed method is highly suitable for real-time applications.
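A toy fuzzy-inference step in the spirit of this approach is sketched below; the membership breakpoints and the single rule are invented, and the actual system builds per-individual templates rather than fixed rules.

```python
# Toy fuzzy-logic sketch -- breakpoints and the rule are invented examples.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def stress_level(gsr_norm, hr_norm):
    # Degrees of membership of each normalized signal in a "high" fuzzy set.
    gsr_high = tri(gsr_norm, 0.4, 1.0, 1.6)
    hr_high  = tri(hr_norm, 0.4, 1.0, 1.6)
    # Rule: IF GSR is high AND HR is high THEN stress is high (min acts as AND).
    return min(gsr_high, hr_high)

print(stress_level(0.9, 0.8))   # close to 1 -> likely stressed
print(stress_level(0.2, 0.3))   # 0 -> relaxed
```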

Relevance: 80.00%

Publisher:

Abstract:

This article considers static analysis based on abstract interpretation of logic programs over combined domains. It is known that analyses over combined domains potentially provide more information than that obtained by the independent analyses. However, the construction of a combined analysis often requires redefining the basic operations for the combined domain. We illustrate a practical approach to maintaining precision in combined analyses of logic programs which reuses the individual analyses and does not redefine the basic operations. The advantages of the approach are that proofs of correctness for the new domains are not required and implementations can be reused. The approach is demonstrated by showing that a combined sharing analysis, constructed from "old" proposals, compares well with other "new" proposals suggested in recent literature, both in terms of efficiency and accuracy.
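The core idea (build the combined domain's operations componentwise from the existing domains, so nothing has to be redefined or re-proved) can be sketched with two toy domains; sign and parity here are illustrative stand-ins for the actual domains combined in the article.

```python
# Minimal direct-product domain sketch: componentwise operations reuse each
# existing domain's abstraction and join unchanged. Domains are illustrative.
from dataclasses import dataclass

class SignDomain:
    def abstract(self, n):  return "zero" if n == 0 else ("pos" if n > 0 else "neg")
    def join(self, a, b):   return a if a == b else "top"

class ParityDomain:
    def abstract(self, n):  return "even" if n % 2 == 0 else "odd"
    def join(self, a, b):   return a if a == b else "top"

@dataclass
class ProductDomain:
    left: object
    right: object
    def abstract(self, n):        # componentwise abstraction
        return (self.left.abstract(n), self.right.abstract(n))
    def join(self, x, y):         # componentwise join, reusing each domain's join
        return (self.left.join(x[0], y[0]), self.right.join(x[1], y[1]))

combined = ProductDomain(SignDomain(), ParityDomain())
a, b = combined.abstract(4), combined.abstract(7)
print(combined.join(a, b))        # ('pos', 'top') -- sign preserved, parity lost
```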

Relevance: 80.00%

Publisher:

Abstract:

This article describes a first group of theoretical and experimental works undertaken at the Polytechnic University of Madrid. One major purpose is to obtain a structural model for the assessment of historical Latin-American vertically laminated planked timber arches built by the Spanish, mainly in the XVII and XVIII centuries. Many of these constructions still stand and represent a notable historical heritage. Pedro Hurtado recently presented his Ph.D. thesis on the historical and construction topics, and a structural study was then undertaken. This step of the structural research focused on static analysis, especially the deformation in the connection system. This article describes part of this first structural research. Even though it is still at a basic level, it shows reasonable agreement with the experimental results. Further static analytical models are now being developed and implemented. The next stage will address the dynamic problem, and improvements will also be made to the constitutive equations.
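As a generic illustration of linear static analysis (not the arch model from the article), the sketch below assembles a stiffness matrix for a chain of springs, a crude stand-in for plank-to-plank connections of finite stiffness, and solves K u = f; all stiffness and load values are invented.

```python
# Generic linear static sketch: springs in series, node 0 fixed, solve K u = f.
import numpy as np

def assemble_spring_chain(stiffnesses):
    """Global stiffness matrix for springs in series."""
    n = len(stiffnesses) + 1
    K = np.zeros((n, n))
    for e, k in enumerate(stiffnesses):
        K[e:e + 2, e:e + 2] += k * np.array([[1, -1], [-1, 1]])
    return K

k_connections = [5.0e6, 5.0e6, 5.0e6]      # hypothetical connection stiffnesses (N/m)
K = assemble_spring_chain(k_connections)
f = np.array([0.0, 0.0, 0.0, 1.0e3])       # 1 kN applied at the free end

# Apply the boundary condition (node 0 fixed) by reducing the system.
u_free = np.linalg.solve(K[1:, 1:], f[1:])
print(np.concatenate([[0.0], u_free]))     # nodal displacements in metres
```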

Relevance: 80.00%

Publisher:

Abstract:

This paper introduces a novel technique for identifying logically related sections of the heap, such as recursive data structures, objects that are part of the same multi-component structure, and related groups of objects stored in the same collection/array. When combined with the lifetime properties of these structures, this information can be used to drive a range of program optimizations including pool allocation, object co-location, static deallocation, and region-based garbage collection. The technique outlined in this paper also improves the efficiency of the static analysis by providing a normal form for the abstract models (speeding the convergence of the static analysis). We focus on two techniques for grouping parts of the heap. The first precisely identifies recursive data structures in object-oriented programs based on the types declared in the program. The second is a novel method for grouping objects that make up the same composite structure, and it allows us to partition the objects stored in a collection/array into groups based on a similarity relation. We provide a parametric component in the similarity relation in order to support specific analysis applications (such as a numeric analysis, which would need to partition the objects based on numeric properties of their fields). Using the Barnes-Hut benchmark from the JOlden suite, we show how these grouping methods can be used to identify various types of logical structures, enabling the application of many region-based program optimizations.
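The first grouping technique, identifying recursive structures from declared types, can be approximated by a reachability check over the field-type graph; the class table below is a made-up example, not the paper's analysis.

```python
# Sketch: a type is (mutually) recursive if it can reach itself via field types.
# The class table is an invented example.
fields = {
    "ListNode": ["ListNode", "Data"],          # self-recursive
    "Tree":     ["TreeNode"],                  # mutually recursive with TreeNode
    "TreeNode": ["TreeNode", "Tree", "Data"],
    "Data":     [],
}

def recursive_types(fields):
    def reachable_from(t):
        seen, stack = set(), list(fields.get(t, []))
        while stack:
            cur = stack.pop()
            if cur not in seen:
                seen.add(cur)
                stack.extend(fields.get(cur, []))
        return seen
    return {t for t in fields if t in reachable_from(t)}

print(recursive_types(fields))   # {'ListNode', 'Tree', 'TreeNode'}
```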

Relevance: 80.00%

Publisher:

Abstract:

The relationship between abstract interpretation and partial evaluation has received considerable attention, and (partial) integrations have been proposed starting from both the partial evaluation and the abstract interpretation perspectives. In this work we present what we argue is the first generic algorithm for efficient and precise integration of abstract interpretation and partial evaluation from an abstract interpretation perspective. Taking as a starting point state-of-the-art algorithms for context-sensitive, polyvariant abstract interpretation and (abstract) partial evaluation of logic programs, we present an algorithm which combines the best of both worlds. Key ingredients include the accurate success propagation inherent to abstract interpretation and the powerful program transformations achievable by partial deduction. In our algorithm, the calls which appear in the analysis graph are not analyzed w.r.t. the original definition of the procedure but w.r.t. specialized definitions of these procedures. Such specialized definitions are obtained by applying both unfolding and abstract executability. Also, our framework is parametric w.r.t. different control strategies and abstract domains. Different combinations of these parameters correspond to existing algorithms for program analysis and specialization. Our approach efficiently computes strictly more precise results than those achievable by each of the individual techniques. The algorithm is one of the key components of CiaoPP, the analysis and specialization system of the Ciao compiler.
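A very small illustration of the polyvariant flavor of this integration: calls are specialized and memoized per abstract call pattern (here, an exponent known to be a constant), so later analysis sees the specialized definition rather than the original one. This is a loose analogy under invented names, not the CiaoPP algorithm.

```python
# Toy polyvariant specialization: one residual definition per abstract call pattern.
_specializations = {}   # (procedure name, abstract call pattern) -> residual code

def specialize_power(exponent):
    key = ("power", exponent)
    if key not in _specializations:
        # "Unfold" power(x, exponent) at specialization time into straight-line code.
        body = " * ".join(["x"] * exponent) if exponent > 0 else "1"
        _specializations[key] = eval(f"lambda x: {body}")
    return _specializations[key]

cube = specialize_power(3)              # residual definition: lambda x: x * x * x
print(cube(5), len(_specializations))   # 125 1
```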
