966 results for quantitative method
Abstract:
To support the Brazilian bovine encephalitis surveillance program, this study examined the differential diagnosis of Neospora caninum in the central nervous system (CNS) by histological analysis (HE staining), immunohistochemistry (IHC), and nested-PCR using primers from the Nc5 region of genomic DNA and the ITS1 region of ribosomal DNA. A sample of 302 cattle presenting neurological syndrome and negative for rabies, aged 0 to 18 years, from herds in 10 Brazilian states, was evaluated for N. caninum from January 2007 to April 2010. All specimens tested negative by IHC and by nested-PCR with primers from the ITS1 region of ribosomal DNA, while two positive cases (0.66%) were found with primers from the Nc5 region of genomic DNA: a 20-month-old male and a 72-month-old female, both from Sao Paulo State. Only the male presented severe multifocal necrotizing encephalitis associated with mononuclear cell infiltration, a pathognomonic lesion caused by parasites of the family Sarcocystidae, and only this case was attributed to N. caninum, thus representing 0.33% positivity. Future studies should explore combining IHC and nested-PCR with real-time PCR, a quantitative method that could be standardized to improve the detection of N. caninum in bovine CNS specimens.
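The two percentages follow directly from the counts given in the abstract; a quick arithmetic check:

```python
# Positivity rates from the reported counts: 2 of 302 animals were
# positive by Nc5 nested-PCR, but only 1 case was confirmed by the
# histopathological lesion, so confirmed positivity is 1/302.
n_samples = 302
pcr_positive = 2
confirmed = 1

pcr_rate = round(100 * pcr_positive / n_samples, 2)       # 0.66
confirmed_rate = round(100 * confirmed / n_samples, 2)    # 0.33
```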
Abstract:
OBJECTIVES: The aim of this study was to investigate the impact of asymptomatic vertebral fractures on quality of life in older women, as part of the Sao Paulo Ageing & Health Study. METHODS: This was a cross-sectional study of a random sample of 180 women, 65 years of age or older, with or without vertebral fractures. The Quality of Life Questionnaire of the European Foundation for Osteoporosis was administered to all subjects. Anthropometric data were obtained by physical examination, and the body mass index was calculated. Lateral thoracic and lumbar spine X-ray scans were obtained to identify asymptomatic vertebral fractures using a semi-quantitative method. RESULTS: Women with asymptomatic vertebral fractures had lower total scores [61.4 (15.3) vs. 67.1 (14.2), p = 0.03] and worse physical function domain scores [69.5 (20.1) vs. 77.3 (17.1), p = 0.02] on the Quality of Life Questionnaire of the European Foundation for Osteoporosis compared with women without fractures. The total score of this questionnaire was also worse in women classified as obese than in women classified as overweight or of normal weight. High physical activity was related to a better total score on this questionnaire (p = 0.01). Likewise, lower physical function scores were observed in women with higher body mass index values (p < 0.05) and lower physical activity levels (p < 0.05). Generalized linear models with gamma distributions and logarithmic link functions, adjusted for age, showed that lower total scores and physical function domain scores on the Quality of Life Questionnaire of the European Foundation for Osteoporosis were related to a high body mass index, lower physical activity, and the presence of vertebral fractures (p < 0.05). CONCLUSION: Vertebral fractures are associated with decreased quality of life, mainly physical functioning, in older community-dwelling women, regardless of age, body mass index, and physical activity.
Therefore, the results highlight the importance of preventing and controlling asymptomatic vertebral fractures to reduce their impact on quality of life among older women.
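The model family named in the results, a generalized linear model with a gamma distribution and a logarithmic link, can be sketched on simulated data. For this family-link pair the IRLS working weights are identically one, so each iteration reduces to an ordinary least-squares step on the working response. All covariates, effect sizes, and the gamma shape below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
bmi = rng.normal(27.0, 4.0, n)              # hypothetical covariates
activity = rng.normal(0.0, 1.0, n)
fracture = rng.integers(0, 2, n).astype(float)

# Invented "true" effects: higher BMI and fracture lower the score,
# physical activity raises it.  Mean of the gamma response is exp(eta).
eta = 4.2 - 0.02 * (bmi - 27.0) + 0.05 * activity - 0.10 * fracture
shape = 50.0
y = rng.gamma(shape, np.exp(eta) / shape)

# IRLS for a gamma GLM with log link: working response z = eta + (y - mu)/mu,
# working weights are 1, so each step is plain least squares.
X = np.column_stack([np.ones(n), bmi, activity, fracture])
beta = np.zeros(X.shape[1])
beta[0] = np.log(y.mean())
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
```

With a sample this size the fitted coefficients recover the signs of the simulated effects, mirroring the direction of the associations reported in the abstract.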
Abstract:
The prehistoric cemetery of Barshalder is located along the main road on the boundary between Grötlingbo and Fide parishes, near the southern end of the island of Gotland in the Baltic Sea. The cemetery was used from c. AD 1-1100. The level of publication in Swedish archaeology of the first millennium AD is low compared to, for instance, the British and German examples. Gotland’s rich Iron Age cemeteries have long been intensively excavated, but few have received monographic treatment. This publication is intended to begin filling this gap and to raise the empirical level of the field. It also aims to make explicit and test the often somewhat intuitively conceived results of much previous research. The analyses deal mainly with the Migration (AD 375–540), Vendel (AD 520–790) and Late Viking (AD 1000–1150) Periods. The following lines of inquiry have been prioritised. 1. Landscape history, i.e. placing the cemetery in a landscape-historical context. (Vol. 1, section 2.2.6) 2. Migration Period typochronology, i.e. the study of change in the grave goods. (Vol. 2, chapter 2) 3. Social roles: gender, age and status. (Vol. 2, chapter 3) 4. Religious identity in the 11th century, i.e. the study of religious indicators in mortuary customs and grave goods, with particular emphasis on the relationship between Scandinavian paganism and Christianity. (Vol. 2, chapter 4) Barshalder is found to have functioned as a central cemetery for the surrounding area, located on peripheral land far away from contemporary settlement, yet placed on a main road along the coast for maximum visibility and possibly near a harbour. Computer-supported correspondence analysis and seriation are used to study the gender attributes among the grave goods and the chronology of the burials. New methodology is developed to distinguish gender-neutral attributes from transgressed gender attributes. Sub-gender grouping due to age and status is explored. 
An independent modern chronology system with rigorous type definitions is established for the Migration Period of Gotland. Recently published chronology systems for the Vendel and Viking Periods are critically reviewed, tested and modified to produce more solid models. Social stratification is studied through burial wealth with a quantitative method, and the results are tested through juxtaposition with several other data types. The Late Viking Period graves of the late 10th and 11th centuries are studied in relation to the contemporary Christian graves at the churchyards. They are found to be symbolically soft-spoken and unobtrusive, with all pagan attributes kept apart from the body in a space between the feet of the deceased and the end of the over-long inhumation trench. A small number of pagan reactionary graves with more forceful symbolism are however also identified. The distribution of different 11th century cemetery types across the island is used to interpret the period’s confessional geography, the scale of social organisation and the degree of allegiance to western and eastern Christianity. 11th century society on Gotland is found to have been characterised by religious tolerance, by an absence of central organisation and by slow piecemeal Christianisation.
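The computer-supported correspondence analysis and seriation mentioned above can be illustrated with a minimal sketch. The incidence matrix here is invented: rows are graves, columns are artefact types, and each grave holds a sliding "window" of consecutive types, i.e. a built-in chronological gradient that the first CA axis should recover (possibly in reversed order):

```python
import numpy as np

# Toy grave-by-type incidence matrix with a perfect seriation structure.
graves, types = 6, 8
N = np.zeros((graves, types))
for g in range(graves):
    N[g, g:g + 3] = 1            # each grave contains 3 consecutive types

# Standard correspondence analysis via SVD of the standardized residuals.
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)          # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal row coordinates on the first axis; sorting by them seriates
# the graves along the underlying gradient.
row_scores = U[:, 0] / np.sqrt(r) * sv[0]
order = np.argsort(row_scores)
```

The first axis places the two chronological endpoints at opposite extremes, which is the property seriation by correspondence analysis relies on; the familiar "horseshoe" curvature appears only on the second axis.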
Abstract:
A polar stratospheric cloud submodel has been developed and incorporated in a general circulation model including atmospheric chemistry (ECHAM5/MESSy). The formation and sedimentation of polar stratospheric cloud (PSC) particles can thus be simulated, as well as the heterogeneous chemical reactions that take place on the PSC particles. For solid PSC particle sedimentation, the need for a tailor-made algorithm has been elucidated. A sedimentation scheme based on first-order approximations of vertical mixing ratio profiles has been developed. It produces relatively little numerical diffusion and can deal well with divergent or convergent sedimentation velocity fields. For the determination of solid PSC particle sizes, an efficient algorithm has been adapted. It assumes a monodisperse radius distribution and thermodynamic equilibrium between the gas phase and the solid particle phase. This scheme, though relatively simple, is shown to produce particle number densities and radii within the observed range. The combined effects of the representations of sedimentation and solid PSC particles on vertical H2O and HNO3 redistribution are investigated in a series of tests. The formation of solid PSC particles, especially of those consisting of nitric acid trihydrate, has been discussed extensively in recent years. Three particle formation schemes in accordance with the most widely used approaches have been identified and implemented. For the evaluation of PSC occurrence, a new data set with unprecedented spatial and temporal coverage was available. A quantitative method for the comparison of simulation results and observations is developed and applied. It reveals that the relative PSC sighting frequency can be reproduced well with the PSC submodel, whereas the detailed modelling of PSC events is beyond the scope of coarse global scale models. In addition to the development and evaluation of new PSC submodel components, parts of existing simulation programs have been improved, e.g. 
a method for the assimilation of meteorological analysis data in the general circulation model, the liquid PSC particle composition scheme, and the calculation of heterogeneous reaction rate coefficients. The interplay of these model components is demonstrated in a simulation of stratospheric chemistry with the coupled general circulation model. Tests against recent satellite data show that the model successfully reproduces the Antarctic ozone hole.
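The kind of first-order sedimentation step described above can be sketched generically. This is not the tailor-made ECHAM5/MESSy scheme, just a minimal explicit upwind step for a vertical mixing-ratio profile with height-dependent fall speeds (so the velocity field may be convergent or divergent); grid spacing, time step, and the profile are invented:

```python
import numpy as np

def sediment_step(q, v, dz, dt):
    """One explicit first-order upwind sedimentation step.

    q  : mixing-ratio profile, levels counted from the top
    v  : downward fall speed at each level [m/s] (may vary with height)
    dz : layer thickness [m], dt : time step [s]
    Returns the updated profile and the flux leaving the bottom level.
    """
    flux = v * q                       # downward flux out of each level
    dq = np.zeros_like(q)
    dq[1:] += dt / dz * flux[:-1]      # gain from the level above
    dq -= dt / dz * flux               # loss to the level below
    return q + dq, dt / dz * flux[-1]

dz, dt = 500.0, 60.0                              # hypothetical grid and step
q = np.array([0.0, 1.0, 2.0, 1.0, 0.0])           # invented profile
v = np.array([0.2, 0.3, 0.5, 0.4, 0.3])           # fall speeds, CFL << 1
q_new, out = sediment_step(q, v, dz, dt)
```

By construction the step conserves mass up to the flux leaving the column through the bottom, and with the Courant number well below one it keeps the profile non-negative; limiting the numerical diffusion of such first-order schemes is exactly the concern the abstract raises.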
Abstract:
Microparticles based on chitosan/pectin polyelectrolyte complexes for the nasal delivery of tacrine hydrochloride. The aim of this study was the search for new solid formulations for the nasal administration of tacrine hydrochloride, in order to reduce its excessive hepatic first-pass effect and increase its bioavailability in the central nervous system. Tacrine was encapsulated in mucoadhesive microparticles based on polyelectrolyte complexes of chitosan and pectin. The microparticles were prepared by two different technological approaches (spray-drying and spray-drying/freeze-drying) and analysed in terms of their dimensional, morphological and physicochemical characteristics. Chitosan nanoparticles cross-linked with sodium cromoglycate for the treatment of allergic rhinitis. Sodium cromoglycate is one of the drugs used for the treatment of allergic rhinitis. As is well known, mucociliary clearance rapidly removes drugs in solution from the nasal cavity, thus increasing the number of daily administrations and, consequently, reducing patient compliance. To overcome this problem, sodium cromoglycate was included in nanoparticles of chitosan, a polymer capable of adhering to the nasal mucosa, prolonging the contact of the formulation with the application site and reducing the number of daily administrations. The nanoparticles obtained were characterised in terms of size, yield, encapsulation efficiency and drug loading, zeta potential and mucoadhesive properties. Quantitative analysis of amorphous budesonide by differential scanning calorimetry. A new quantitative solid-state method based on Differential Scanning Calorimetry (DSC) was developed, capable of selectively and accurately quantifying the amount of amorphous budesonide present in a solid mixture. 
During method development, problems related to the validation of analytical methods on solid samples were addressed, such as the mixing of solid powders for the preparation of standard mixtures and the calculation of precision.
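The calibration logic behind such a quantitative DSC method can be sketched as follows. The assumption (not stated in the abstract, but standard for this kind of method) is that a measured thermal response of standard physical mixtures is proportional to their amorphous content, so a line is fitted and an unknown sample is read off it; all numbers are invented:

```python
import numpy as np

# Invented calibration standards: amorphous content [%] vs. a measured
# DSC response [J/g] assumed proportional to it.
amorphous_pct = np.array([10.0, 25.0, 50.0, 75.0, 100.0])
enthalpy = np.array([2.1, 5.0, 10.2, 15.1, 20.0])

# Linear calibration line fitted through the standards.
slope, intercept = np.polyfit(amorphous_pct, enthalpy, 1)

def amorphous_content(dH):
    """Invert the calibration line for an unknown sample's response [J/g]."""
    return (dH - intercept) / slope

estimate = amorphous_content(10.0)   # hypothetical unknown, ~50% amorphous
```

The validation problems the text mentions (preparing homogeneous standard powder mixtures, estimating precision) are exactly what determines how trustworthy the fitted line, and hence `estimate`, is.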
Abstract:
I present a new experimental method called Total Internal Reflection Fluorescence Cross-Correlation Spectroscopy (TIR-FCCS). It is a method that can probe hydrodynamic flows near solid surfaces, on length scales of tens of nanometres. Fluorescent tracers flowing with the liquid are excited by evanescent light, produced by epi-illumination through the periphery of a high-NA oil-immersion objective. Due to the fast decay of the evanescent wave, fluorescence only occurs for tracers within ~100 nm of the surface, resulting in very high normal resolution. The time-resolved fluorescence intensity signals from two laterally shifted (in the flow direction) observation volumes, created by two confocal pinholes, are independently measured and recorded. The cross-correlation of these signals provides important information about the tracers' motion and thus their flow velocity. Due to the high sensitivity of the method, fluorescent species of different sizes, down to single dye molecules, can be used as tracers. The aim of my work was to build an experimental setup for TIR-FCCS and use it to measure the shear rate and slip length of water flowing on hydrophilic and hydrophobic surfaces. However, in order to extract these parameters from the measured correlation curves, a quantitative data analysis is needed. This is not a straightforward task, because the complexity of the problem makes it impossible to derive the analytical expressions for the correlation functions needed to fit the experimental data. Therefore, in order to process and interpret the experimental results, I also describe a new numerical method for analysing the acquired auto- and cross-correlation curves: Brownian Dynamics techniques are used to produce simulated auto- and cross-correlation functions and to fit the corresponding experimental data. 
I show how to combine detailed and fairly realistic theoretical modelling of the phenomena with accurate measurements of the correlation functions, in order to establish a fully quantitative method to retrieve the flow properties from the experiments. An importance-sampling Monte Carlo procedure is employed to fit the experiments; it provides the optimum parameter values together with their statistical error bars. The approach is well suited both for modern desktop PCs and for massively parallel computers; the latter allow the data analysis to be completed within short computing times. I applied this method to study the flow of aqueous electrolyte solution near smooth hydrophilic and hydrophobic surfaces. Generally, no slip is expected on a hydrophilic surface, while on a hydrophobic surface some slippage may exist. Our results show that on both hydrophilic and moderately hydrophobic (contact angle ~85°) surfaces the slip length is ~10-15 nm or lower and, within the limitations of the experiments and the model, indistinguishable from zero.
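The basic idea of cross-correlating the two detector signals can be illustrated with a toy calculation (this is not the Brownian Dynamics analysis described above): a fluctuating signal passes both observation volumes with a delay, and the lag of the cross-correlation peak gives the transit time, hence the flow velocity. Signal, noise, focus separation, and velocity are all synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1e-4                 # sampling interval [s], assumed
shift_um = 2.0            # assumed lateral separation of the two foci [um]
true_v = 100.0            # assumed flow velocity [um/s]
lag_true = int(round(shift_um / (true_v * dt)))   # delay in samples

n = 4000
s = rng.normal(size=n)                            # common fluctuating signal
a = s + 0.3 * rng.normal(size=n)                  # upstream detector + noise
b = np.roll(s, lag_true) + 0.3 * rng.normal(size=n)  # delayed downstream detector

# Cross-correlate the mean-subtracted signals; the peak position gives
# the transit time between the two observation volumes.
xc = np.correlate(b - b.mean(), a - a.mean(), mode="full")
lag = int(xc.argmax()) - (n - 1)
v_est = shift_um / (lag * dt)                     # recovered velocity [um/s]
```

In the real experiment the correlation curves are broadened by diffusion and the evanescent intensity profile, which is precisely why the fitting must be done against simulated curves rather than this simple peak-lag picture.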
Abstract:
Plutonium is present in the environment as a consequence of atmospheric nuclear tests, nuclear weapons production and industrial releases over the past 50 years. To study temporal trends, a high-resolution Pu record was obtained by analyzing 52 discrete samples of an alpine firn/ice core from Colle Gnifetti (Monte Rosa, 4450 m a.s.l.), dating from 1945 to 1990. The 239Pu signal was recorded directly, without decontamination or preconcentration steps, using an Inductively Coupled Plasma - Sector Field Mass Spectrometer (ICP-SFMS) equipped with a high-efficiency sample introduction system, thus requiring much less sample preparation than previously reported methods. The 239Pu profile reflects the three main periods of atmospheric nuclear weapons testing: the earliest peak lasted from 1954/55 to 1958 and was caused by the first testing period, reaching a maximum in 1958. Despite a temporary halt of testing in 1959/60, the Pu concentration decreased only by half with respect to the 1958 peak, due to long atmospheric residence times. In 1961/62 Pu concentrations rapidly increased, reaching a maximum in 1963, which was about 40% more intense than the 1958 peak. After the signing of the "Limited Test Ban Treaty" between the USA and the USSR, Pu deposition decreased very sharply, reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu concentrations with smaller peaks (about 20-30% of the 1963 maximum) which might be related to the deposition of Saharan dust contaminated by the French nuclear tests of the 1960s. The data presented are in very good agreement with Pu profiles previously obtained from the Col du Dome ice core (by multi-collector ICP-MS) and the Belukha ice core (by Accelerator Mass Spectrometry, AMS). Although a semi-quantitative method was employed here, the results are quantitatively comparable to previously published results.
Abstract:
A major challenge for a developing country such as Bangladesh is to supply basic services to its most marginalized populations, both rural and urban. The government struggles to provide basic necessities such as water and electricity. In marginalized urban communities in Bangladesh, in particular informal settlements, meeting basic needs is even more difficult. Most informal settlements are built in response to rapid immigration to urban centers and are thought of as 'temporary structures', though many have stood for decades. In addition, as the settlements often squat on private land, access to formalized services such as electricity or water is largely absent. In some cases, electricity and water connections are brought in, but through informal, non-government-sanctioned means; these hookups are deemed 'illegal' by the state. My research focuses on recent efforts to ameliorate problems associated with the lack of basic services in informal settlements in Bangladesh, in this case the lack of light. When the government fails to meet the needs of the general population, non-government organizations tend to step in and intervene. A new emphasis has been placed on solar bottle systems in informal urban settlement areas to help address some energy needs (specifically daytime lighting). One such example is the solar bottle light in Bangladesh, a project introduced by the organization 'Change'. Reactions to this technology among users have been mixed. This is where my research intervenes. I have used a quantitative method to investigate user satisfaction with the solar bottle lights among residents of the informal settlements, addressing the overarching question: is there a disconnect between the benefits perceived by the ENGO and the satisfaction of the residents of Dhaka City's informal settlements? This paper uses survey responses to investigate the level of user satisfaction and its contributing factors.
Abstract:
The integrin receptor α4β1 is a cell surface heterodimer involved in a variety of highly regulated cellular interactions. The purpose of this dissertation was to identify and characterize unique structural and functional properties of the α4β1 molecule that may be important for adhesion regulation and signal transduction. To study these properties and to establish a consensus sequence for the α4 subunit, cDNA encoding α4 was cloned and sequenced. A comparison with previously described human α4 sequences identified several substitutions in the 5′ and 3′ untranslated regions, and a nonsynonymous G to A transition in the coding region, resulting in a glutamine substitution for arginine. Further analysis of this single nucleotide substitution indicated that two variants of the α4 subunit exist, and when compared with three ancestrally related species, the new form cloned in our laboratory was found to be evolutionarily conserved. The expression of α4 cDNA in transfected K562 erythroleukemia cells, and subsequent studies using flow cytofluorometric, immunochemical, and ligand binding/blocking analyses, confirmed α4β1 as a receptor for fibronectin (FN) and vascular cell adhesion molecule-1 (VCAM-1), and provided a practical means of identifying two novel monoclonal antibody (mAb) binding epitopes on the α4β1 complex that may play important roles in the regulation of leukocyte adhesion. To investigate the association of α4β1-mediated adhesion with signals involved in the spreading of lymphocytes on FN, a quantitative method of analysis was developed using video microscopy and digital imaging. The results showed that HPB-ALL (α4β1-high, α5β1-negative) cells could adhere and actively spread on human plasma FN, but not on a control substrate. 
Many cell types which express different levels of the α4β1 and α5β1 FN-binding integrins were examined for their ability to function in these events. Using anti-α4 and anti-α5 mAbs, it was determined that cell adhesion to FN was influenced by both β1 integrins, while cell spreading was found to be dependent on the α4β1 complex. In addition, inhibitors of phospholipase A2 (PLA2), 5-lipoxygenases, and cyclooxygenases blocked HPB-ALL cell spreading, yet had no effect on cell adhesion to FN, and the impaired spreading induced by the PLA2 inhibitor cibacron blue was restored by the addition of exogenous arachidonic acid (AA). These results suggest that the interaction of α4β1 with FN, the activation of PLA2, and the subsequent release of AA may be involved in lymphocyte spreading.
Abstract:
Despite the popularity of the positron-emitting glucose analog (18F)-2-deoxy-2-fluoro-D-glucose (2FDG) for the noninvasive "metabolic imaging" of organs with positron emission tomography (PET), the physiological basis for the tracer has not been tested, and the potential of 2FDG for the rapid kinetic analysis of altered glucose metabolism in the intact heart has not been fully exploited. We therefore developed a quantitative method to characterize metabolic changes of myocardial glucose metabolism noninvasively and with high temporal resolution. The first objective of the work was to provide direct evidence that the initial steps in the metabolism of 2FDG are the same as for glucose and that 2FDG is retained by the tissue in proportion to the rate of glucose utilization. The second objective was to characterize the kinetic changes in myocardial glucose transport and phosphorylation in response to changes in work load, competing substrates, acute ischemia and reperfusion, and the addition of insulin. To assess changes in myocardial glucose metabolism, isolated working rat hearts were perfused with glucose and 2FDG. Tissue uptake of 2FDG and the input function were measured on-line by external detection. The steady state rate of 2FDG phosphorylation was determined by graphical analysis of 2FDG time-activity curves. The rate of 2FDG uptake was linear with time and the tracer was retained in its phosphorylated form. Tissue accumulation of 2FDG decreased within seconds with a reduction in work load, in the presence of competing substrates, and during reperfusion after global ischemia. Thus, most interventions known to alter glucose metabolism induced rapid parallel changes in 2FDG uptake. By contrast, insulin caused a significant increase in 2FDG accumulation only in hearts from fasted animals when perfused at a sub-physiological work load. 
The mechanism for this phenomenon is not known but may be related to the existence of two different glucose transporter systems and/or glycogen metabolism in the myocardial cell. It is concluded that (1) 2FDG traces glucose uptake and phosphorylation in the isolated working rat heart; and (2) early and transient kinetic changes in glucose metabolism can be monitored with high temporal resolution with 2FDG and a simple positron coincidence counting system. The new method has revealed transients of myocardial glucose metabolism, which would have remained unnoticed with conventional methods. These transients are not only important for the interpretation of glucose metabolic PET scans, but also provide insights into mechanisms of glucose transport and phosphorylation in heart muscle.
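The graphical analysis of time-activity curves mentioned above can be sketched with a Patlak-type plot, a standard way to extract the steady-state uptake rate of an irreversibly trapped tracer from the tissue curve and the input function. The abstract does not specify which graphical method was used, and the curves below are synthetic, generated from the Patlak relation itself:

```python
import numpy as np

t = np.linspace(0.1, 30.0, 300)         # minutes after tracer injection
Cp = np.exp(-t / 20.0)                  # hypothetical plasma input function

# Cumulative trapezoidal integral of the input function over time.
intCp = np.concatenate(
    ([0.0], np.cumsum(0.5 * (Cp[1:] + Cp[:-1]) * np.diff(t))))

Ki_true, V0 = 0.03, 0.40                # invented uptake rate and initial volume
Ct = Ki_true * intCp + V0 * Cp          # tissue curve obeying the Patlak relation

# Patlak plot: normalized tissue activity vs. "stretched time";
# the slope of the late linear part is the uptake rate Ki.
x = intCp / Cp
y = Ct / Cp
slope, intercept = np.polyfit(x[50:], y[50:], 1)
Ki_est = slope                          # recovered uptake rate [1/min]
```

Because the synthetic data obey the Patlak relation exactly, the fit recovers the generating constants; with measured curves the late-time linearity itself is the check that the tracer is irreversibly retained, as the abstract argues for 2FDG.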
Abstract:
En el proceso general de la sociedad por la mejora continua hacia la Calidad, el sector de la construcción, y más específicamente la actividad de los Arquitectos con la redacción de los proyectos, no pueden, ni deben, quedar al margen. La presente investigación apunta un procedimiento para el control técnico de los proyectos y demuestra la eficacia y rentabilidad de éste o cualquier otro método de control, avanzando una aproximación a los modelos de costes de calidad de los estudios de arquitectura. El método de trabajo que se ha previsto para el desarrollo de la tesis cuenta con una base principal consistente en definir un procedimiento general de revisión de los proyectos, tipificando los principales errores (sistema de puntos de inspección), analizando las causas que los generan, su repercusión en el plazo, durabilidad y satisfacción del cliente, así como en definir un método de cuantificación que nos aproxime a la "importancia" (económica) que tienen o inducen los errores e indefiniciones detectadas. Para demostrar la validez de la hipótesis inicial sobre la rentabilidad de aplicar un sistema de control técnico del proyecto, se ha aplicado una parte del procedimiento general particularizado para la evaluación sistemática de los problemas, indefiniciones y fallos detectados, al que llamamos de forma simplificada Método Partícula Éste se aplica sobre una muestra de proyectos que se revisan y que, para que los resultados del análisis sean representativos, se seleccionaron de forma aleatoria, respondiendo topológicamente en sus variables definitorias a la población que se pretende estudiar: la totalidad de proyectos de ejecución de viviendas producidos en el ámbito territorial de Madrid (Comunidad) en el plazo comprendido entre los años 1990 y 1995. Pero además esta representatividad está condicionada a la mayor o menor exactitud de la valoración que se haga de los sobrecostos que puedan generar los errores e indefiniciones de los proyectos. 
Se hace pues imprescindible el tratar de objetivar al máximo los criterios de valoración de los mismos. Con los datos generados en la revisión de cada proyecto se analizan la totalidad de resultados de la muestra objeto de estudio, sacando conclusiones sobre frecuencia e importancia de los errores, incidencia de las variables que influyen, y posibles combinaciones de variables que incrementan el riesgo. Extrapolando el análisis al método general y a la población objeto de estudio, se extraen conclusiones significativas respecto a la eficacia del control técnico de proyectos, así como de las formas de optimizar las inversiones realizadas en el mismo, depurando y optimizando selectivamente el método general definido. Con el análisis de los modelos de coste de calidad se puede constatar cómo las inversiones en desarrollar sistemas de aseguramiento de la calidad de los proyectos, o, de forma más modesta, controlando la calidad técnica de los mismos, los estudios de arquitectura, amén del mejor servicio ofrecido a los clientes, y lo que ésto supone de permanencia en el mercado, mejoran significativamente su competitividad y su margen de beneficio, demostrando que son muy rentables tanto para los propios arquitectos, como para la sociedad en general. ABSTRACT The construction sector as a whole, and especifically architects as project drafters, must fully participate in the general process of society for continuous improvement towards quality. Our research outlines a procedure for the technical control of projects and shows the efficacy and cost-effectiveness of this or any other control method, putting fonvard an approach to quality cost-models in Architecture offices. Our procedure consists mainly in defining a general method of revising projects, typifying main errors (Points of inspection system), analysing their causes, their repercussion in clients' durability and satisfaction. 
It will also define a quantitative method to assess the economic importance of detected errors and indefinitions. To prove our initial hypothesis on the cost-effectiveness of applying a system of technical control to projects, we have applied part of the general procedure, adjusting it to the systematic evaluation of the problems, indefinitions and errors we found. This will simply be called the Particular Method. We use it on a sample of projects which were randomly selected, for the sake of representativeness, and which, in their defining variables, match the population under study topologically: every housing project executed in the Community of Madrid between 1990 and 1995. Furthermore, this representativeness is conditioned by the accuracy of the assessment of the additional costs due to errors or lack of definition in the projects. We must therefore try to be precise in the evaluation of those costs. With the data obtained from the revision of each project, we analyse every result from the sample under study, drawing conclusions about the frequency and importance of each error, the variables that cause them, and the combinations of variables which increase risk. By extrapolating this analysis to the General Method and the population under study, we draw significant conclusions on the effectiveness of the technical control of projects, as well as on the ways to optimise the investment in it, improving and refining the General Method selectively. Analysing quality cost models, we can show how, by investing in systems that assure project quality or, more modestly, by controlling projects' technical quality, architecture firms not only serve their clients better, with its impact on the firm's permanence in the market, but also improve their competitiveness and profit margin, proving that such control is profitable both for architects themselves and for the general public.
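The cost-effectiveness argument above can be reduced to a simple trade-off: review cost versus the expected overrun that review prevents. The sketch below illustrates that arithmetic only; every figure and rate in it is a hypothetical assumption, not data from the thesis.

```python
# Hypothetical cost-of-quality comparison: does investing in technical
# project control pay off? All figures below are illustrative assumptions.

def control_payoff(project_budget, review_cost_rate, error_cost_rate,
                   detection_rate):
    """Net saving from reviewing a project before construction.

    review_cost_rate: review cost as a fraction of the project budget.
    error_cost_rate: expected overrun from design errors, as a fraction.
    detection_rate: share of that overrun the review prevents.
    """
    review_cost = project_budget * review_cost_rate
    avoided_overrun = project_budget * error_cost_rate * detection_rate
    return avoided_overrun - review_cost

# A 1M-euro housing project, 0.5% spent on review, a 4% expected overrun
# from design errors, 60% of which the inspection-point review catches:
saving = control_payoff(1_000_000, 0.005, 0.04, 0.60)
print(f"Net saving: {saving:,.0f} EUR")
```

Control is profitable whenever the avoided overrun exceeds the review cost; with the assumed rates the review pays for itself several times over.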
Resumo:
Competition for consumer preference in the global marketplace has produced a scenario of growing rivalry. The point of sale came to stand out as a brand communication channel after the professionalisation of Brazilian retail, which began in the 1980s. As a result, the point of sale came to demand research on products, on consumer behaviour and on specific promotional tools. The objective of this work is to test the role of merchandising materials in the consumer's purchase process in supermarkets. A quantitative method was chosen for the research, using an experimental technique in a natural setting, i.e. in supermarkets selected as an Experimental Group and a Control Group. The Experimental Group received merchandising materials for one week and its sales were compared with those of the Control Group. In the between-group comparison, a sales increase of 27.86% was recorded in the experimental supermarket relative to the control supermarket. Compared with the week before the experiment, sales in the experimental supermarket fell by 12.62%. The decline in consumer purchasing power during the period is one possible explanation for this result.(AU)
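The study's two figures come from the same elementary calculation applied to two baselines: the control store in the same week, and the experimental store in the previous week. A minimal sketch of that arithmetic follows; the weekly sales totals are hypothetical placeholders chosen to reproduce the reported percentages, since the abstract gives only the percentages.

```python
# Between-group and before/after comparisons as percentage changes.
# Sales totals below are hypothetical; only the formula is general.

def pct_change(value, baseline):
    """Percentage difference of `value` relative to `baseline`."""
    return (value - baseline) / baseline * 100

# Hypothetical weekly sales (currency units):
experimental_during = 127_860   # experimental store, merchandising week
control_during = 100_000        # control store, same week
experimental_before = 146_320   # experimental store, week before

between_groups = pct_change(experimental_during, control_during)
before_after = pct_change(experimental_during, experimental_before)

print(f"Experimental vs control: {between_groups:+.2f}%")
print(f"Vs previous week:        {before_after:+.2f}%")
```

The choice of baseline matters: the same week of experimental sales reads as a gain against the control store but as a loss against the store's own prior week, which is why the abstract must invoke an external factor (falling purchasing power) to reconcile the two.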