957 results for Two fluid model
Changes in mass and nutrient content of wood during decomposition in a south Florida mangrove forest
Abstract:
1. Large pools of dead wood in mangrove forests following disturbances such as hurricanes may influence nutrient fluxes. We hypothesized that decomposition of wood of mangroves from Florida, USA (Avicennia germinans, Laguncularia racemosa and Rhizophora mangle), and the consequent nutrient dynamics, would depend on species, location in the forest relative to freshwater and marine influences and whether the wood was standing, lying on the sediment surface or buried. 2. Wood disks (8–10 cm diameter, 1 cm thick) from each species were set to decompose at sites along the Shark River, either buried in the sediment, on the soil surface or in the air (above both the soil surface and high tide elevation). 3. A simple exponential model described the decay of wood in the air, and neither species nor site had any effect on the decay coefficient during the first 13 months of decomposition. 4. Over 28 months of decomposition, buried and surface disks decomposed following a two-component model, with labile and refractory components. Avicennia germinans had the largest labile component (18 ± 2% of dry weight), while Laguncularia racemosa had the lowest (10 ± 2%). Labile components decayed at rates of 0.37–23.71% month⁻¹, while refractory components decayed at rates of 0.001–0.033% month⁻¹. Disks decomposing on the soil surface had higher decay rates than buried disks, and both were higher than disks in the air. All species had similar decay rates of the labile and refractory components, but A. germinans exhibited faster overall decay because of a higher proportion of labile components. 5. Nitrogen content generally increased in buried and surface disks, but there was little change in N content of disks in the air over the 2-year study. Between 17% and 68% of total phosphorus in wood leached out during the first 2 months of decomposition, with buried disks having the greater losses, P remaining constant or increasing slightly thereafter. 6. 
Newly deposited wood from living trees was a short-term source of N for the ecosystem but, by the end of 2 years, had become a net sink. Wood, however, remained a source of P for the ecosystem. 7. As in other forested ecosystems, coarse woody debris can have a significant impact on carbon and nutrient dynamics in mangrove forests. The prevalence of disturbances, such as hurricanes, that can deposit large amounts of wood on the forest floor accentuates the importance of downed wood in these forests.
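The two-component model described in point 4 treats the wood as a labile pool and a refractory pool, each decaying exponentially at its own rate. A minimal sketch, using the 18% labile fraction reported for A. germinans but hypothetical decay rates chosen from within the ranges quoted above:

```python
import math

def mass_remaining(t, labile_frac, k_labile, k_refractory):
    """Percent of initial dry weight left after t months when labile and
    refractory pools each decay exponentially (rates in fraction/month)."""
    refractory_frac = 1.0 - labile_frac
    return 100.0 * (labile_frac * math.exp(-k_labile * t)
                    + refractory_frac * math.exp(-k_refractory * t))

# 18% labile pool as reported for A. germinans; the decay rates are
# hypothetical values picked from within the ranges quoted above.
remaining_28mo = mass_remaining(28, labile_frac=0.18,
                                k_labile=0.10, k_refractory=0.0003)
```

Because the refractory rate is orders of magnitude smaller, mass loss is rapid at first and then nearly flat, which is why a single-exponential fit fails for the buried and surface disks.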
Abstract:
County jurisdictions in America are increasingly exercising self-government in the provision of public community services within the context of second-order federalism. In states exercising this form of contemporary governance, county governments with "reformed" policy-making structures and professional management practices have begun to rival or surpass municipalities in the delivery of local services with regional implications, such as environmental protection (Benton 2002, 2003; Marando and Reeves 1993). The voter referendum, a form of direct democracy, is an important component of county land preservation and environmental protection policies. The recent growth and success of land preservation voter referendums nationwide reflects citizens' increased participation in government and their desire to protect vacant land and its natural environment from threats of over-development, urbanization and sprawl, loss of open space and farmland, deterioration of ecosystems, and inadequate park and recreational amenities. The study's design employs a sequential mixed method. First, a quantitative approach employs the Heckman two-step model, fitted with variables for the non-random sample of 227 voter referendum counties and all non-voter-referendum counties in the U.S. from 1988 to 2009. Second, qualitative data collected from an in-depth investigation of three South Florida county case studies, with twelve public administrator interviews, are transformed for integration with the quantitative findings. The purpose of the qualitative method is to complement, explain and enrich the statistical analysis of county demographic, socio-economic, terrain, regional, governance and government, political preference, environmentalism, and referendum-specific factors. 
The research finds that government factors are significant for the success of land preservation voter referendums; more specifically, the presence of self-government authority (home rule charter), a reformed structure (county administrator/manager or elected executive), and environmental interest groups. In addition, this study concludes that successful counties are often located on the coast, exhibit population and housing growth, and have older and more educated citizens who vote Democratic in presidential elections. The analysis of case study documents and public administrator interviews finds that pragmatic considerations of timing, local politics and networking of regional stakeholders are also important features of success. Further research utilizing additional public participation, local government and public administration factors is suggested.
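The Heckman two-step model used in the quantitative stage corrects for the non-random selection of referendum counties: a probit selection equation is fitted first, and the outcome regression is then augmented with the inverse Mills ratio. A minimal sketch on simulated data (all variables, coefficients and sample sizes are hypothetical, not from the study):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000

# Simulated data: the selection and outcome errors are correlated,
# so naive OLS on the selected subsample is biased.
w = rng.normal(size=n)              # selection covariate
x = rng.normal(size=n)              # outcome covariate
u = rng.normal(size=n)              # selection-equation error
e = 0.5 * u + rng.normal(size=n)    # outcome error, correlated with u

selected = (0.5 + w + u) > 0        # e.g. whether a county holds a referendum
y = 1.0 + 2.0 * x + e               # outcome, observed only for selected units

# Step 1: probit of selection on the full sample, fitted by maximum likelihood.
W = np.column_stack([np.ones(n), w])

def neg_loglik(g):
    p = np.clip(norm.cdf(W @ g), 1e-10, 1 - 1e-10)
    return -np.sum(selected * np.log(p) + (~selected) * np.log(1 - p))

gamma = minimize(neg_loglik, x0=np.zeros(2), method="BFGS").x

# Step 2: OLS on the selected subsample, augmented with the inverse Mills
# ratio, whose coefficient absorbs the selection bias.
idx = W[selected] @ gamma
mills = norm.pdf(idx) / norm.cdf(idx)
X2 = np.column_stack([np.ones(mills.size), x[selected], mills])
beta, *_ = np.linalg.lstsq(X2, y[selected], rcond=None)
```

Here beta recovers the true intercept and slope despite the selected sample, and the coefficient on the Mills ratio estimates the error covariance.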
Abstract:
Given the role ethnic identity plays as a protective factor against the effects of marginalization and discrimination (Umaña-Taylor, 2011), longitudinal research on ethnic identity has become increasingly important. However, successful identity development must incorporate elements from both one's ethnic group and the United States (Berry, 1980). Despite this, relatively few studies have jointly evaluated ethnic and American identity (Schwartz et al., 2012). The current dissertation, guided by three objectives, sought to address this and several other gaps in the literature. First, the psychometric properties of the Multigroup Ethnic Identity Measure (MEIM) and the American Identity Measure (AIM) were evaluated. Second, the dissertation examined growth trends in recently immigrated Hispanic adolescents' and their caregivers' ethnic and American identity. Last, the relationship between adolescents' and caregivers' ethnic and American identity was evaluated. The study used an archival sample consisting of 301 recently immigrated Hispanic families collected from Miami (N = 151) and Los Angeles (N = 150). Consistent with previous research, results in Study 1 indicated that a two-factor model reliably provided better fit than a one-factor model and established longitudinal invariance for the MEIM and the AIM. Results from Study 2 found significant growth in adolescents' American identity. While some differences were found across site and nationality, evidence suggested recently immigrated Hispanic adolescents were becoming more bicultural. Counterintuitively, results found a significant decline in caregivers' ethnic identity, which future studies should examine further. Finally, results from Study 3 found several significant positive relationships between adolescents' and their caregivers' ethnic and American identity. Findings provided preliminary evidence for the importance of examining identity development through a systemic lens. 
Despite several limitations, these three studies represented a step forward in addressing the current gaps in the cultural identity literature. Implications for future investigation are discussed.
Abstract:
Prior research has shown that college women in the United States experience significantly high rates of verbal intimate partner violence (IPV); estimates indicate that approximately 20-30% of college women experience verbal IPV victimization (e.g., Hines, 2007; Muñoz-Rivas, Graña, O'Leary, & González, 2009). Verbal IPV is associated with physical consequences, such as chronic pain and migraine headaches, and psychological implications, including anxiety, depression, suicidal ideation, and substance use (Coker et al., 2002). However, few studies have examined verbal IPV in college populations, and none have focused on Hispanic college women, who are members of the largest minority population on college campuses today (Pew Research Center, 2013) and experience higher rates of IPV victimization (Ingram, 2007). The current dissertation sought to address these gaps by examining the influence of familial conflict strategies on Hispanic college women's verbal IPV victimization. Further, within-group differences were explored, with specific attention paid to the role of acculturation and gender role beliefs. A total of 906 Hispanic college women from two Hispanic-Serving Institutions (HSIs) in the southeastern (N=502) and southwestern (N=404) United States participated in the three-part study. Study one examined the influence of parental conflict strategies on Hispanic women's verbal IPV victimization in current romantic relationships. Consistent with previous research, results indicated that parental use of verbal violence influenced verbal IPV victimization in the current romantic relationship. A unidirectional effect of paternal verbal aggression towards the participant on maternal verbal aggression towards the participant was also found. Study two examined the influence of parental conflict strategies, acculturation, and gender role beliefs on victimization. Acculturation and gender role beliefs were found not to influence participants' verbal IPV victimization. 
Study three examined within-group differences using Study two's model. Differences were found between the southeastern and southwestern participants; gender role beliefs increased rates of verbal IPV victimization in the southeastern population. The current dissertation fills a gap in the literature on IPV experiences in Hispanic college populations, underscores the importance of examining verbal IPV trends, and highlights differing cultural influences within populations traditionally viewed as homogeneous. The implications for future research are discussed.
Abstract:
The first essay developed a respondent model of Bayesian updating for a double-bound dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioral model, can correct for much of the WTP bias. Additional results provided caution against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, and coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across the six particular WQ variables used, and is furthermore dependent upon the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results continue to underscore the need to better understand both the WQ measure and the statistical form homebuyers use in making their purchase decisions. The third essay addressed a limitation of existing hurricane evacuation models by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, at every potential evacuation time period prior to actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more time period for a revised hurricane forecast. 
A hypothetical two-period model of evacuation and a realistic multi-period model of evacuation that incorporates actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated with existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes in regard to the timing of the evacuation decision was achieved.
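The optimal-stopping framing can be illustrated by backward induction: in each period the household compares the cost of evacuating now against the expected cost of waiting for a revised forecast. All numbers below are hypothetical and chosen only to show the mechanics, not taken from the essay:

```python
# Hypothetical backward-induction sketch of the evacuation stopping problem.
T = 4
c = [1.0, 1.3, 1.8, 2.6]      # evacuation cost rises as landfall nears
r = [0.30, 0.25, 0.20, 0.10]  # chance the revised forecast shows a miss
L = 12.0                       # expected loss if hit without evacuating

V = [0.0] * (T + 1)
V[T] = L                       # never evacuated and the storm never turned away
evacuate_now = [False] * T
for t in range(T - 1, -1, -1):
    wait = (1 - r[t]) * V[t + 1]   # waiting costs nothing if the storm misses
    V[t] = min(c[t], wait)
    evacuate_now[t] = c[t] <= wait
```

With these numbers the option value of an early forecast revision makes waiting optimal in the first period, after which evacuating dominates — the timing trade-off the essay's multi-period model calibrates against observed evacuation data.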
Abstract:
Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues with the aim of improving our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model in which agents face a cash-in-advance constraint and set prices to the local market, and the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model has the potential to rationalize the Uncovered Interest Parity Puzzle. 
Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction from relative consumption.
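The random-walk benchmark in Chapter Two can be illustrated with a toy out-of-sample comparison (simulated data, not the dissertation's dataset): when the true process is a random walk, a fitted AR(1) gains nothing over the no-change forecast.

```python
import numpy as np

rng = np.random.default_rng(1)
# Log exchange rate simulated as a pure random walk (illustrative only).
rate = np.cumsum(rng.normal(scale=0.006, size=1500))
train, test = rate[:1000], rate[1000:]

# Random-walk (no-change) forecast: tomorrow's rate equals today's.
rw_pred = test[:-1]

# AR(1) fitted on the training window by OLS, applied one step ahead.
y, x = train[1:], train[:-1]
X = np.column_stack([np.ones_like(x), x])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
ar_pred = a + b * test[:-1]

def rmse(pred):
    return float(np.sqrt(np.mean((test[1:] - pred) ** 2)))

rw_rmse, ar_rmse = rmse(rw_pred), rmse(ar_pred)
```

The estimated autoregressive coefficient sits close to one, and the fitted model's forecast RMSE is no better than the random walk's — the pattern the chapter reports for the far richer FIGARCH specification.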
Abstract:
Since the creation of supersonic vehicles during the Second World War, engineers have paid special attention to the interaction between aerodynamic forces and aircraft structures, owing to a highly destructive phenomenon called panel flutter. Flutter in aeronautical panels is a self-excited aeroelastic phenomenon that can occur during supersonic flight due to the dynamic instability of the inertial, elastic and aerodynamic forces of the system. In the flutter condition, when the critical aerodynamic pressure is reached, the vibration amplitudes of the panel become dynamically unstable and increase exponentially with time, significantly affecting the fatigue life of the aeronautical components involved. Thus, in this paper, the interest is to investigate the possibility of reducing the effects of supersonic aeroelastic instability of rectangular plates by applying passive constrained viscoelastic layers. The rationale for such a study is that, since the addition of viscoelastic materials reduces vibration amplitudes, it becomes important to quantify the suppression of plate flutter coalescence modes that can be obtained. Moreover, although much research on the suppression of panel flutter has been carried out using passive, semi-active and active control techniques, very few studies are adapted to deal with the problem of estimating the flutter speeds of viscoelastic systems, since they must conveniently account for the frequency- and temperature-dependent behavior of the viscoelastic material. In this context, two different models of viscoelastic material are developed and applied to a sandwich plate model using finite elements. After the presentation of the theoretical foundations of the methodology, a numerical study on the flutter analysis of a three-layer sandwich plate is described.
Abstract:
Basement intersected in Holes 525A, 528, and 527 on the Walvis Ridge consists of submarine basalt flows and pillows with minor intercalated sediments. These holes are situated on the crest and mid- and lower NW flank of a NNW-SSE-trending ridge block which would have closely paralleled the paleo mid-ocean ridge. The basalts were erupted approximately 70 Ma, a date consistent with formation at the paleo mid-ocean ridge. The basalt types vary from aphyric quartz tholeiites on the Ridge crest to highly plagioclase phyric olivine tholeiites on the flank. These show systematic differences in incompatible trace element and isotopic composition, and many element and isotope ratio pairs form systematic trends with the Ridge crest basalts at one end and the highly phyric Ridge flank basalts at the other. The low ¹⁴³Nd/¹⁴⁴Nd (0.51238) and high ⁸⁷Sr/⁸⁶Sr (0.70512) ratios of the Ridge crest basalts suggest derivation from an old Nd/Sm and Rb/Sr enriched mantle source. This isotopic signature is similar to that of alkaline basalts on Tristan da Cunha but offset by somewhat lower ¹⁴³Nd/¹⁴⁴Nd values. The isotopic ratio trends may be extrapolated beyond the Ridge flank basalts (which have ¹⁴³Nd/¹⁴⁴Nd of 0.51270 and ⁸⁷Sr/⁸⁶Sr of 0.70417) in the direction of typical MORB compositions. These isotopic correlations are equally consistent with mixing of depleted and enriched end-member melts or partial melting of an inhomogeneous, variably enriched mantle source. However, observed Zr-Ba-Nb-Y interelement relationships are inconsistent with any simple two-component model of magma mixing or partial melting. They also preclude extensive involvement of depleted (N-type) MORB material or its mantle sources in the petrogenesis of Walvis Ridge basalts.
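As a reference point for the two-component mixing hypothesis the abstract tests, a concentration-weighted mixing calculation shows what a simple two-end-member model predicts. The Sr isotope ratios below are the crest and flank values quoted above; the Sr concentrations are hypothetical, chosen only to illustrate the curvature:

```python
def mix_ratio(f, r1, c1, r2, c2):
    """Isotope ratio of a binary mix: fraction f of end-member 1, with each
    end-member's ratio weighted by its element concentration (c1, c2)."""
    return (f * c1 * r1 + (1 - f) * c2 * r2) / (f * c1 + (1 - f) * c2)

# 87Sr/86Sr end-member ratios quoted in the abstract; the Sr
# concentrations (ppm) are hypothetical illustration values.
crest, flank = 0.70512, 0.70417
sr_crest, sr_flank = 120.0, 90.0

halfway = mix_ratio(0.5, crest, sr_crest, flank, sr_flank)
# Every mixture falls between the end-member ratios; unequal concentrations
# make the trajectory a hyperbola rather than a straight line.
```

Any sample violating such monotonic between-end-member behavior in one element system while obeying it in another (as the Zr-Ba-Nb-Y relationships do here) rules out the simple two-component model.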
Abstract:
A north-south transect of 17 cores was constructed along the eastern boundary of the California Current system from 33° to 42°N to investigate the changes in biogenic sedimentation over the past 30 kyr. Percentages and mass accumulation rates of CaCO3, Corg, and biogenic opal were assembled at 500 to 1000 years/sample to provide relatively high resolution. Time-space maps reveal a complex pattern of changes that do not follow a simple glacial-interglacial two-mode model. Biogenic sedimentation shows responses that are sometimes time-transgressive and sometimes coeval, and most of the responses show more consistency within a limited geographic area than any temporal consistency. Reconstructed conditions during late oxygen isotope stage 3 were more like early Holocene conditions than any other time during the last 30 kyr. Coastal upwelling and productivity during oxygen isotope stage 3 were relatively strong along the central California margin but were weak along the northern California margin. Precipitation increased during the last glacial interval in the central California region, and the waters of the southern California margin had relatively low productivity. Productivity on the southern Oregon margin was relatively low at the beginning of the last glacial interval, but by about 20 ka, productivity in this area significantly increased. This change suggests that the center of the divergence of the West Wind Drift shifted south at this time. The end of the last glacial interval was characterized by increased productivity in the southern California margin and increased upwelling along the central California margin but upwelling remained weak along the northern California margin. A sudden (<300 years) decrease in CaCO3, Corg, and biogenic opal occurred at 13 ka. The changes suggest a major reorientation of the atmospheric circulation in the North Pacific and western North America and the establishment of a strong seasonality in the central California region. 
A carbonate preservation event occurred at 10 ka that appears to reflect the uptake of CO2 by the terrestrial biosphere as the northern latitudes were reforested following retreat of the glaciers. The Holocene has been a period of relatively high productivity in the southern California margin, relatively strong coastal upwelling along the central California margin, relatively weak upwelling along the northern California margin, and the northward migration of the divergence zone of the West Wind Drift.
Abstract:
Costs related to inventory usually represent a significant share of a company's total assets. Despite this, companies in general pay little attention to inventory, even though the benefits of effective inventory management are obvious: less tied-up capital, increased customer satisfaction and a better working environment. Permobil AB, Timrå is in an intense period of revenue growth: the production unit is aiming for a 30% increase in output over the next two years. To make this possible, the company has to improve the way it distributes and handles material. The purpose of the study is to provide useful information and concrete proposals for action, so that the company can build a strategy for an effective and sustainable inventory management solution. Alternative forecasting methods are suggested in order to reach a more nuanced view of different articles and how they should be managed. The Analytic Hierarchy Process (AHP) was used to let specially selected persons decide the criteria by which articles should be valued. The criteria they agreed on were annual volume value, lead time, frequency rate and purchase price. The other proposed method was a two-dimensional model in which annual volume value and frequency were the criteria that determined the class in which an article should be placed. Both methods resulted in significant changes compared with the current solution. For the spare-part inventory, different forecasting methods were tested and compared with the current solution. It turned out that the current forecasting method performed worse than both a moving average and exponential smoothing with trend. The small sample of ten random articles is not big enough to reject the current solution, but the result is still reason enough for the company to monitor the quality of its forecasts.
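The two forecasting methods that outperformed the current solution can be sketched side by side (the demand numbers are invented for illustration): on trending data a moving average lags, while exponential smoothing with trend (Holt's method) extrapolates the trend.

```python
# Minimal sketches of the two compared forecast methods; demand data invented.
def moving_average(series, window):
    """One-step-ahead forecast: mean of the last `window` observations."""
    return sum(series[-window:]) / window

def holt(series, alpha=0.4, beta=0.2):
    """Holt's exponential smoothing with trend, one-step-ahead forecast.
    alpha smooths the level, beta smooths the trend (illustrative values)."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend

demand = [100, 104, 109, 115, 118, 124, 129, 135]  # steadily trending demand
ma_forecast = moving_average(demand, window=3)
holt_forecast = holt(demand)
```

For spare parts with a growth trend, the moving average systematically under-forecasts, which is one reason the trend-aware method fared better in the comparison.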
Abstract:
This thesis develops bootstrap methods for factor models, which have been widely used for forecasting since the pioneering article by Stock and Watson (2002) on diffusion indices. These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. The thesis therefore proposes econometric tools that improve inference in factor models using latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two written in collaboration with Sílvia Gonçalves and Benoit Perron. The first chapter studies how bootstrap methods can be used for inference in models that forecast h periods ahead. To that end, it examines bootstrap inference in a factor-augmented regression context where the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for confidence intervals of the estimated coefficients using these approaches, compared with asymptotic theory and the wild bootstrap, in the presence of serial correlation in the regression errors. The second chapter proposes bootstrap methods for constructing prediction intervals that relax the assumption of Gaussian innovations. We propose bootstrap prediction intervals for an observation h periods ahead and for its conditional mean. We assume these forecasts are made using a set of factors extracted from a large panel of variables. 
Because we treat these factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed constructing asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct valid prediction intervals under more general conditions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014). The third chapter suggests consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion, whose validity we also establish, generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improved probability of parsimoniously selecting the estimated factors compared with available selection methods. The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, the factors highly correlated with interest rate spreads and the Fama-French factors have good predictive power for excess returns.
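The wild bootstrap underlying the residual-based approaches of the first chapter can be sketched in a plain regression setting (simulated data; the factor-augmented and serial-correlation refinements of the chapter are omitted): residuals are resampled with random sign flips, which preserves each observation's error variance.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
# Heteroskedastic errors: exactly the case the wild bootstrap handles.
y = 1.0 + 2.0 * x + rng.normal(size=n) * (1.0 + 0.5 * np.abs(x))

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

# Wild bootstrap: multiply each residual by a random sign (Rademacher
# weights) and refit, building the sampling distribution of the slope.
B = 999
boot_slopes = np.empty(B)
for b in range(B):
    y_star = X @ beta_hat + resid * rng.choice([-1.0, 1.0], size=n)
    bb, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    boot_slopes[b] = bb[1]

ci_low, ci_high = np.percentile(boot_slopes, [2.5, 97.5])
```

The block and dependent variants proposed in the chapter replace the independent sign flips with weights that are constant within blocks, or smoothly dependent across observations, so that serial correlation in the errors is also reproduced.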
Abstract:
Introduction: Handedness is the difference in control capacity between the two sides of the body. Methods used to assess manual handedness include direct observation of dominant-hand use and self-report inventories completed by the person being assessed. The Edinburgh Handedness Inventory (EHI) is the most widely used instrument for assessing manual handedness. Despite its wide use, there are no studies in Portugal evaluating its validity and reliability. Objectives: To study the psychometric properties of the Edinburgh Handedness Inventory in a sample of the Portuguese population. Methods: The sample consists of 290 people (135 men and 155 women), aged between 18 and 65 years. All participants signed an informed consent form and completed a battery of neuropsychological tests. Results: The mean EHI score was 62.36 (SD = 38.00). Of the six sociodemographic variables (age, sex, education, area of residence, region and profession), three had a significant influence on EHI scores: age, area of residence and region. The reliability and temporal stability of the EHI were adequate. Confirmatory factor analysis showed that the model is not well explained by one factor; a two-factor model was also not adequate. Conclusion: Despite good internal consistency, we cannot consider this test the most appropriate instrument for measuring the handedness construct.
Abstract:
This study had three objectives: (1) to develop a comprehensive truck simulation that executes rapidly, has a modular program construction to allow variation of vehicle characteristics, and is able to realistically predict vehicle motion and the tire-road surface interaction forces; (2) to develop a model of doweled portland cement concrete pavement that can be used to determine slab deflection and stress at predetermined nodes, and that allows for the variation of traditional thickness design factors; and (3) to implement these two models on a workstation with suitable menu-driven modules so that both existing and proposed pavements can be evaluated with respect to design life, given specific characteristics of the heavy vehicles that will be using the facility. This report summarizes the work performed during the first year of the study. Briefly, the following has been accomplished: a two-dimensional model of a typical 3-S2 tractor-trailer combination was created; a finite element structural analysis program, ANSYS, was used to model the pavement; and computer runs have been performed varying the parameters defining both vehicle and road elements. The resulting time-specific displacements for each node are plotted, and the displacement basin is generated for defined vehicles. Relative damage to the pavement can then be estimated. A damage function relating load repetitions to further pavement deterioration must be assumed. Comparison with actual damage on Interstate 80 will eventually allow verification of these procedures.
Abstract:
Purpose: To study the psychometric properties and normative data of Raven's Standard Progressive Matrices in a Portuguese community sample. Method: The sample consists of 697 people (314 men and 383 women), aged between 12 and 90 years. All participants signed an informed consent form and completed a battery of neuropsychological tests, which included Raven's Standard Progressive Matrices (RSPM), the Rey 15-Item Memory Test, the Zung Self-Rating Anxiety Scale, the Frontal Assessment Battery, and the Rey Complex Figure Test. Results: The mean RSPM score was 44.47 (SD = 10.78). All sociodemographic variables (age, sex, education, profession, region, and typology of urban area), with the exception of marital status, had a significant influence on RSPM scores. The reliability and temporal stability of the RSPM were adequate. Exploratory and confirmatory factor analysis showed that the model is not well explained by one factor; a four-factor model was also not adequate. Conclusion: The data from this study suggest that the instrument has potential for use with the Portuguese population.
Abstract:
In Fall 2015, the Engineering and Physical Science Library (EPSL) began lending anatomical models as part of its course reserves program. EPSL received a partial skeleton and two muscle model figures from instructors of BSCI105. These models circulate for 4 hours at a time and are generally used by small, collaborative groups of students in the library. This poster will look at the challenges and rewards of adding these items to EPSL's course reserves.