986 results for R-Statistical computing
Abstract:
On 20 July 2010 the Hungarian Power Exchange (HUPX) began operation. On 16 August 2010, instead of the 45-60 EUR/MWh prices experienced in the first days, market participants faced prices of 2,999 EUR in certain hours. International experience shows that the appearance of extremely high prices is not unusual on power exchanges; indeed, research has focused on exploring the causes of these so-called price spikes and on the quantitative and qualitative analysis of their occurrence. In this article the author reviews the results of statistical studies of extreme prices in the literature and examines how well their conclusions hold up against the time series of Hungarian prices. The author presents a model framework that describes the behaviour of electricity prices with distributions whose parameters vary periodically across the hours of the week. Unfortunately, the short history of the Hungarian Power Exchange does not allow a separate price distribution to be fitted for every hour of the week. The author therefore classifies the hours of the week into two groups according to the character of the price distribution: hours that are risky and hours that are less risky with respect to the appearance of price spikes. A deterministic two-state regime-switching model is then built to describe HUPX prices, with which the risky and less risky hours can be identified and the nature of extreme price movements characterised.
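The hour-classification idea in the abstract above can be sketched in code. This is a minimal illustration on synthetic data: the spike threshold, the quantile cutoff, and the classification rule are assumptions for the sketch, not the paper's actual criteria.

```python
import numpy as np

def classify_hours(prices, hours_of_week, spike_threshold, risky_quantile=0.75):
    """Label each hour of the week as 'risky' or 'less risky' by how often
    its prices exceed a spike threshold (hypothetical rule; the paper's
    exact classification criterion is not given in the abstract)."""
    hours = np.asarray(hours_of_week)
    prices = np.asarray(prices, dtype=float)
    spike_rate = {}
    for h in np.unique(hours):
        p = prices[hours == h]
        spike_rate[h] = np.mean(p > spike_threshold)
    cutoff = np.quantile(list(spike_rate.values()), risky_quantile)
    return {h: ("risky" if r >= cutoff and r > 0 else "less risky")
            for h, r in spike_rate.items()}

# Toy data: hour 18 occasionally spikes to 2,999 EUR, hour 3 does not.
rng = np.random.default_rng(0)
hours = np.tile([3, 18], 500)
prices = rng.normal(50, 5, size=1000)            # typical 45-60 EUR/MWh level
prices[(hours == 18) & (rng.random(1000) < 0.05)] = 2999.0  # rare spikes
labels = classify_hours(prices, hours, spike_threshold=300.0)
```

A full regime-switching model would additionally fit a separate price distribution to each group; the sketch only performs the risky/less-risky partition.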
Abstract:
Colleges base their admission decisions on a number of factors to determine which applicants have the potential to succeed. This study used data for students who graduated from Florida International University between 2006 and 2012. Two models were developed (one using the SAT as the principal explanatory variable and the other using the ACT) to predict college success, measured as the student's college grade point average at graduation. Other factors used in these predictions included high school performance, socioeconomic status, major, gender, and ethnicity. The model using the ACT had a higher R^2, but the model using the SAT had a lower mean square error. African Americans had a significantly lower college grade point average than graduates of other ethnicities, and females had a significantly higher college grade point average than males.
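The model comparison described above (higher R^2 for one model, lower mean square error for the other) can be illustrated with a minimal ordinary-least-squares sketch. The data here are synthetic, not the FIU records, and each model uses only its single test-score predictor rather than the study's full covariate set.

```python
import numpy as np

def fit_and_score(x, y):
    """OLS with an intercept; returns (R^2, MSE). A stand-in for the study's
    models, which also included high school performance, socioeconomic
    status, major, gender and ethnicity."""
    A = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    mse = np.mean(resid ** 2)
    r2 = 1.0 - resid.var() / y.var()
    return r2, mse

# Synthetic illustration: GPA loosely driven by both scores plus noise.
rng = np.random.default_rng(1)
n = 400
sat = rng.normal(1100, 150, n)
act = rng.normal(24, 4, n)
gpa = 1.0 + 0.002 * sat + 0.03 * act + rng.normal(0, 0.3, n)
r2_sat, mse_sat = fit_and_score(sat, gpa)
r2_act, mse_act = fit_and_score(act, gpa)
```

Comparing `r2_*` and `mse_*` across the two fits mirrors the study's finding that the two criteria need not favour the same model.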
Abstract:
Peer reviewed
Abstract:
We would like to thank the study participants and the clinical and research staff at the Queen Elizabeth National Spinal Injury Unit, as without them this study would not have been possible. We are grateful for the funding received from Glasgow Research Partnership in Engineering for the employment of SC during data collection for this study. We would like to thank the Royal Society of Edinburgh's Scottish Crucible scheme for providing the opportunity for this collaboration to occur. We are also indebted to Maria Dumitrascuta for her time and effort in producing inter-repeatability results for the shape models.
Abstract:
The work is supported in part by NSFC (Grant no. 61172070), IRT of Shaanxi Province (2013KCT-04), and EPSRC (Grant no. Ep/1032606/1).
Abstract:
Review paper, to appear in the Springer Lecture Notes in Physics volume "Thermal transport in low dimensions: from statistical physics to nanoscale heat transfer" (S. Lepri ed.)
Abstract:
With the growing pressure of eutrophication in tropical regions, the Mauritian shelf provides a natural setting in which to understand variability in mesotrophic assemblages. Site-specific dynamics occur throughout the 1200 m depth gradient. The shallow assemblages divide into three types of warm-water mesotrophic foraminiferal assemblages, a consequence not only of high primary productivity restricting light to the benthos but also of low pore-water oxygenation, shelf geomorphology, and sediment partitioning. At intermediate depth (approx. 500 m), the increase in foraminiferal diversity is due to the cold-water coral habitat providing a greater range of microniches. Planktonic species characterise the lower bathyal zone, which emphasises the reduced benthic carbonate production at depth; nevertheless, owing to the strong hydrodynamics within the Gulf, planktonic species occur in notable abundances throughout the whole depth gradient. Overall, this study can readily be compared with other tropical marine settings when investigating the long-term effects of tropical eutrophication and the biogeographic distribution of carbonate-producing organisms.
Abstract:
The VRAG-R is designed to assess the likelihood of violent or sexual reoffending among male offenders. The data set comprises demographic, criminal history, psychological assessment, and psychiatric information about the offenders gathered from institutional files together with post-release recidivism information. The VRAG-R is a twelve-item actuarial instrument and the scores on these items form part of the data set. Because one of the goals of the VRAG-R development project was to compare the VRAG-R to the VRAG, subjects' VRAG scores are included in this data set. Access to the VRAG-R dataset is restricted. Contact Data Services, Queen's University Library (academic.services@queensu.ca) for access.
Abstract:
Poly(methylvinylether-co-maleic acid) (PMVE/MA) is commonly used as a component of pharmaceutical platforms, principally to enhance interactions with biological substrates (mucoadhesion). However, limited knowledge of the rheological properties of this polymer and of their relationships with mucoadhesion has hindered its biomedical use as a mono-component platform. This study presents a comprehensive examination of the rheological properties of aqueous PMVE/MA platforms and defines their relationships with mucoadhesion using multiple regression analysis. By dilute solution viscometry, the intrinsic viscosities of un-neutralised PMVE/MA and of PMVE/MA neutralised with NaOH or triethylamine (TEA) were 22.32 ± 0.89, 274.80 ± 1.94 and 416.49 ± 2.21 dL g⁻¹, respectively, illustrating greater polymer chain expansion following neutralisation with TEA. PMVE/MA platforms exhibited shear-thinning properties. Increasing polymer concentration increased the consistencies, zero-shear-rate (ZSR) viscosities (determined from flow rheometry), storage and loss moduli, dynamic viscosities (defined using oscillatory analysis) and mucoadhesive properties, yet decreased the loss tangents of the neutralised polymer platforms. TEA-neutralised systems possessed significantly and substantially greater consistencies, ZSR and dynamic viscosities, storage and loss moduli and mucoadhesion, and lower loss tangents, than their NaOH counterparts. Multiple regression analysis identified the dominant role of polymer viscoelasticity in mucoadhesion (r > 0.98). The mucoadhesive properties of PMVE/MA platforms were considerable, exceeding those of other platforms that have successfully been shown to enhance in vivo retention when applied to the oral cavity, indicating a positive role for PMVE/MA mono-component platforms in pharmaceutical and biomedical applications.
Abstract:
This dissertation covers two separate topics in statistical physics. The first part of the dissertation focuses on computational methods of obtaining the free energies (or partition functions) of crystalline solids. We describe a method to compute the Helmholtz free energy of a crystalline solid by direct evaluation of the partition function. In the many-dimensional conformation space of all possible arrangements of N particles inside a periodic box, the energy landscape consists of localized islands corresponding to different solid phases. Calculating the partition function for a specific phase involves integrating over the corresponding island. Introducing a natural order parameter that quantifies the net displacement of particles from lattice sites, we write the partition function in terms of a one-dimensional integral along the order parameter, and evaluate this integral using umbrella sampling. We validate the method by computing free energies of both face-centered cubic (FCC) and hexagonal close-packed (HCP) hard sphere crystals with a precision of $10^{-5}k_BT$ per particle. In developing the numerical method, we find several scaling properties of crystalline solids in the thermodynamic limit. Using these scaling properties, we derive an explicit asymptotic formula for the free energy per particle in the thermodynamic limit. In addition, we describe several changes of coordinates that can be used to separate internal degrees of freedom from external, translational degrees of freedom. The second part of the dissertation focuses on engineering idealized physical devices that work as Maxwell's demon. We describe two autonomous mechanical devices that extract energy from a single heat bath and convert it into work, while writing information onto memory registers. Additionally, both devices can operate as Landauer's eraser, namely they can erase information from a memory register, while energy is dissipated into the heat bath.
The phase diagrams and the efficiencies of the two models are solved and analyzed. These two models provide concrete physical illustrations of the thermodynamic consequences of information processing.
Abstract:
Information concerning the run-time behaviour of programs ("program profiling") can be of the greatest assistance in improving program efficiency. Two software devices have been developed for use on ICL 1900 Series machines to provide such information. DIDYMUS is probabilistic in approach and uses multitasking facilities to sample the instruction addresses used by a program at run time. It will work regardless of the source language of the program and matches the detected addresses against a loader map to produce a histogram. SCAMP is restricted to profiling Algol 68-R programs, but provides deterministic information concerning those language constructs that are monitored. Procedure calls to appropriate counting routines are inserted into the source text in a pre-pass prior to compilation. The profile information is printed out at the end of the program run. It has been found that these two approaches complement each other very effectively.
Abstract:
The opportunity to produce microalgal biomass has attracted interest because of the many uses such biomass can have, whether in bioenergy production, as a food source, or as a product of carbon dioxide biofixation. In general, large-scale production of cyanobacteria and microalgae is monitored through offline physico-chemical analyses. In this context, the objective of this work was to monitor cell concentration in a raceway photobioreactor for microalgal biomass production using digital data acquisition and process control techniques, through inline acquisition of illuminance, biomass concentration, temperature and pH data. To this end, it was necessary to build a software-based sensor capable of determining microalgal biomass concentration from optical measurements of the intensity of scattered monochromatic radiation, and to develop a mathematical model of microalgal biomass production on the microcontroller, using a natural computing algorithm to fit the model. An autonomous system for recording cultivation data was designed, built and tested during outdoor pilot-scale cultivations of Spirulina sp. LEB 18. A biomass concentration sensor based on measuring transmitted radiation was tested. In a second stage, an optical sensor of Spirulina sp. LEB 18 biomass concentration was conceived, built and tested, based on measuring the intensity of radiation scattered by the cyanobacterium suspension, in a laboratory experiment under controlled conditions of illumination, temperature and biomass suspension flow. From the light-scattering measurements, a neuro-fuzzy inference system was built, which serves as a software sensor of the biomass concentration in the culture.
Finally, from the biomass concentrations of the culture over time, the use of the Arduino platform for empirical modelling of the growth kinetics with the Verhulst equation was explored. Measurements from the optical sensor based on the intensity of monochromatic radiation transmitted through the suspension, used under outdoor conditions, showed low correlation between biomass concentration and radiation, even for concentrations below 0.6 g/L. When optical scattering by the culture suspension was investigated, monochromatic radiation at 530 nm at angles of 45° and 90° showed a linearly increasing relationship with concentration, with a coefficient of determination of 0.95 in both cases. It was possible to build a software-based biomass concentration sensor using the combined information of scattered radiation intensity at angles of 45° and 135°, with a coefficient of determination of 0.99. It is feasible to simultaneously perform inline determination of Spirulina cultivation process variables and empirical kinetic modelling of the microorganism's growth through the Verhulst equation on an Arduino microcontroller.
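The empirical kinetic modelling step with the Verhulst (logistic) equation can be sketched as a curve fit. The parameter values and the data below are illustrative only, not taken from the Spirulina sp. LEB 18 cultivations, and the abstract's microcontroller implementation would use its own fitting algorithm rather than SciPy.

```python
import numpy as np
from scipy.optimize import curve_fit

def verhulst(t, x0, mu, xmax):
    """Verhulst (logistic) growth:
    x(t) = xmax / (1 + (xmax/x0 - 1) * exp(-mu * t))."""
    return xmax / (1.0 + (xmax / x0 - 1.0) * np.exp(-mu * t))

# Synthetic biomass curve (g/L over days) with small measurement noise.
t = np.linspace(0, 20, 40)
rng = np.random.default_rng(2)
obs = verhulst(t, 0.15, 0.45, 2.8) + rng.normal(0, 0.02, t.size)

# Fit the three kinetic parameters from the noisy observations.
popt, _ = curve_fit(verhulst, t, obs, p0=[0.1, 0.3, 3.0])
x0_fit, mu_fit, xmax_fit = popt
```

With inline concentration data from the software sensor, the same fit could be refreshed as each new measurement arrives.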
Abstract:
Three types of forecasts of the total Australian production of macadamia nuts (t nut-in-shell) have been produced early each year since 2001. The first is a long-term forecast, based on the expected production from the tree census data held by the Australian Macadamia Society, suitably scaled up for missing data and assumed new plantings each year. These long-term forecasts range out to 10 years into the future, and form a basis for industry and market planning. Secondly, a statistical adjustment (termed the climate-adjusted forecast) is made annually for the coming crop. As the name suggests, climatic influences are the dominant factors in this adjustment process; however, other terms such as bienniality of bearing, prices and orchard aging are also incorporated. Thirdly, industry personnel are surveyed early each year, with their estimates integrated into a growers' and pest-scouts' forecast. Initially conducted on a 'whole-country' basis, these models are now constructed separately for the six main production regions of Australia, with the regional results combined for national totals. Ensembles or suites of step-forward regression models using biologically relevant variables have been the major statistical method adopted; however, developing methodologies such as nearest-neighbour techniques, generalised additive models and random forests are continually being evaluated in parallel. The overall error rates average 14% for the climate forecasts and 12% for the growers' forecasts. These compare with 7.8% for USDA almond forecasts (based on extensive early-crop sampling) and 6.8% for coconut forecasts in Sri Lanka. However, our somewhat disappointing results were mainly due to a series of poor crops attributed to human factors, which have now been factored into the models. Notably, the 2012 and 2013 forecasts had average errors of 7.8% and 4.9%, respectively. Future models should show continuing improvement as more data-years become available.
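The step-forward (forward stepwise) regression mentioned above can be sketched as a greedy selection loop. This is a generic illustration on synthetic data, not the authors' implementation; the variable names and the residual-sum-of-squares selection criterion are assumptions for the sketch.

```python
import numpy as np

def forward_stepwise(X, y, names, max_terms=3):
    """Greedy forward selection: repeatedly add the predictor that most
    reduces the residual sum of squares of an OLS fit with intercept."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_terms):
        best, best_rss = None, np.inf
        for j in remaining:
            A = np.column_stack([np.ones(len(y)), X[:, selected + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ beta) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        selected.append(best)
        remaining.remove(best)
    return [names[j] for j in selected]

# Toy data: yield driven mainly by rainfall plus a biennial-bearing signal.
rng = np.random.default_rng(3)
n = 60
rain = rng.normal(0, 1, n)
temp = rng.normal(0, 1, n)
biennial = np.tile([1.0, -1.0], n // 2)
noise_var = rng.normal(0, 1, n)
y = 3.0 * rain + 1.5 * biennial + rng.normal(0, 0.5, n)
X = np.column_stack([rain, temp, biennial, noise_var])
order = forward_stepwise(X, y, ["rain", "temp", "biennial", "noise"], max_terms=2)
```

An ensemble in the abstract's sense would repeat such fits over many candidate variable suites and pool the resulting forecasts.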