927 results for distributions to shareholders
Abstract:
The extent to which density-dependent processes regulate natural populations is the subject of an ongoing debate. We contribute evidence to this debate showing that density-dependent processes influence the population dynamics of the ectoparasite Aponomma hydrosauri (Acari: Ixodidae), a tick species that infests reptiles in Australia. The first piece of evidence comes from an unusually long-term dataset on the distribution of ticks among individual hosts. If density-dependent processes are influencing either host mortality or vital rates of the parasite population, and those distributions can be approximated with negative binomial distributions, then general host-parasite models predict that the aggregation coefficient of the parasite distribution will increase with the average intensity of infections. We fit negative binomial distributions to the frequency distributions of ticks on hosts, and find that the estimated aggregation coefficient k increases with increasing average tick density. This pattern indirectly implies that one or more vital rates of the tick population must be changing with increasing tick density, because mortality rates of the tick's main host, the sleepy lizard, Tiliqua rugosa, are unaffected by changes in tick burdens. Our second piece of evidence is a re-analysis of experimental data on the attachment success of individual ticks to lizard hosts using generalized linear modelling. The probability of successful engorgement decreases with increasing numbers of ticks attached to a host. This is direct evidence of a density-dependent process that could lead to an increase in the aggregation coefficient of tick distributions described earlier. The population-scale increase in the aggregation coefficient is indirect evidence of a density-dependent process or processes sufficiently strong to produce a population-wide pattern, and thus also likely to influence population regulation. 
The direct observation of a density-dependent process is evidence of at least part of the responsible mechanism.
The use of non-standard CT conversion ramps for Monte Carlo verification of 6 MV prostate IMRT plans
Abstract:
Monte Carlo (MC) dose calculation algorithms have been widely used to verify the accuracy of intensity-modulated radiotherapy (IMRT) dose distributions computed by conventional algorithms, due to their ability to precisely account for the effects of tissue inhomogeneities and multileaf collimator characteristics. The two approaches differ, however, in how dose is calculated and reported. Whereas dose from conventional methods is traditionally computed and reported as the water-equivalent dose (Dw), MC dose algorithms calculate and report dose to medium (Dm). In order to compare both methods consistently, the conversion of MC Dm into Dw is therefore necessary. This study aims to assess the effect of applying the conversion of MC-based Dm distributions to Dw for prostate IMRT plans generated for 6 MV photon beams. MC phantoms were created from the patient CT images using three different ramps to convert CT numbers into material and mass density: a conventional four-material ramp (CTCREATE) and two simplified CT conversion ramps: (1) air and water with variable densities and (2) air and water with unit density. MC simulations were performed using the BEAMnrc code for the treatment head simulation and the DOSXYZnrc code for the patient dose calculation. The conversion of Dm to Dw by scaling with the stopping power ratios of water to medium was also performed in a post-MC calculation process. The comparison of MC dose distributions calculated in conventional and simplified (water with variable densities) phantoms showed that the effect of material composition on dose-volume histograms (DVH) was less than 1% for soft tissue and about 2.5% near and inside bone structures. The comparison of the MC distributions computed in the two simplified water phantoms showed that the effect of material density on the DVH was less than 1% for all tissues.
Additionally, MC dose distributions were compared with the predictions from an Eclipse treatment planning system (TPS), which employed a pencil beam convolution (PBC) algorithm with Modified Batho Power Law heterogeneity correction. Eclipse PBC and MC calculations (conventional and simplified phantoms) agreed well (<1%) for soft tissues. For the femoral heads, differences of up to 3% were observed between the DVH for Eclipse PBC and MC calculated in conventional phantoms. MC simulations using the CT conversion ramp of water with variable densities showed dose discrepancies of only 0.5% relative to the PBC algorithm. Moreover, converting Dm to Dw using mass stopping power ratios resulted in a significant shift (up to 6%) in the DVH for the femoral heads compared to the Eclipse PBC curve. Our results show that, for prostate IMRT plans delivered with 6 MV photon beams, no conversion of MC dose from medium to water using stopping power ratios is needed. In contrast, MC dose calculation using water with variable density may be a simple way to avoid the problem found with the dose conversion method based on the stopping power ratio.
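The post-MC conversion step, scaling dose to medium by the water-to-medium mass stopping-power ratio voxel by voxel, can be sketched as below. The ratio values and material labels are placeholders for illustration, not tabulated reference data:

```python
import numpy as np

# Illustrative, hypothetical water-to-medium mass stopping-power ratios
# for a 6 MV beam; real values come from tabulations, these are placeholders.
SPR_WATER_TO_MEDIUM = {"water": 1.000, "soft_tissue": 1.01, "bone": 1.11}

def dm_to_dw(dose_medium, material_map):
    """Post-MC conversion of dose-to-medium to dose-to-water by scaling
    each voxel with the water-to-medium mass stopping-power ratio."""
    dose_medium = np.asarray(dose_medium, dtype=float)
    lookup = np.vectorize(SPR_WATER_TO_MEDIUM.__getitem__, otypes=[float])
    return dose_medium * lookup(material_map)

dm = np.array([[2.00, 1.95],
               [1.90, 1.80]])              # Gy, dose to medium per voxel
mats = np.array([["soft_tissue", "soft_tissue"],
                 ["water", "bone"]])
dw = dm_to_dw(dm, mats)  # the bone voxel shifts most, as in the DVH comparison
```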
Abstract:
Intensity-modulated radiotherapy (IMRT) is a technique introduced to shape dose distributions more precisely to the tumour, allowing a higher dose escalation in the volume to be irradiated while simultaneously decreasing the dose to the organs at risk, which consequently reduces treatment toxicity. This technique is widely used in prostate and head and neck (H&N) tumours. Given the complexity and the high doses used in this technique, it is necessary to ensure safe and secure administration of the treatment through quality control programmes for IMRT. The purpose of this study was to evaluate statistically the quality control measurements made for IMRT plans of prostate and H&N patients, before the beginning of treatment, analysing their variations, the percentage of rejected and repeated measurements, the averages, standard deviations and proportion relations.
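A minimal sketch of the kind of statistical summary described, assuming per-field point-dose QA checks against a hypothetical 5% tolerance (all readings invented):

```python
import numpy as np

def qa_summary(measured, planned, tolerance=0.05):
    """Summarise per-field IMRT QA point-dose checks: relative deviation
    of measured vs planned dose, mean/std of the deviations, and the
    percentage of fields outside a tolerance (here a hypothetical 5%)."""
    measured = np.asarray(measured, dtype=float)
    planned = np.asarray(planned, dtype=float)
    deviation = (measured - planned) / planned
    rejected = np.abs(deviation) > tolerance
    return {
        "mean_dev": deviation.mean(),
        "std_dev": deviation.std(ddof=1),
        "pct_rejected": 100.0 * rejected.mean(),
    }

# Hypothetical ionisation-chamber readings for 8 fields (illustrative):
planned = [1.80, 2.00, 1.60, 1.90, 2.10, 1.75, 1.85, 2.05]
measured = [1.82, 1.97, 1.50, 1.91, 2.12, 1.74, 1.88, 2.04]
stats = qa_summary(measured, planned)  # one field (-6.25%) exceeds tolerance
```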
Abstract:
Company valuation has always been a subject of intense reflection, with various specialists trying to find the models best suited to the specific situations for which they need to determine a value. In the Portuguese business context, the practice of management oriented towards value creation (Value-Based Management) is beginning to gain significance. The concept of Value-Based Management has seen particular development over the last 20 years as a result of the globalisation and deregulation of financial markets, advances in information technology, and the growing importance of institutional investors. Several analysts have presented evidence that companies adopting VBM systems improve their economic performance relative to others of similar size in the same sector. It is in this context that EVA (Economic Value Added) presents itself as a privileged performance metric in the processes of monitoring strategic decisions. In the present work we address the concept of value-based management and its importance for the shareholder, which entails reviewing other traditional valuation models based on book value. As a metric for evaluating a company's past performance in terms of value creation, we give particular attention to the study of EVA, with reference to the possible correlation between this metric and MVA (Market Value Added). The main objective is to analyse empirically the relationship between EVA, as a performance measure associated with the creation of shareholder value, and the company's performance. To that end, we carry out a case study of a Portuguese business group that is a reference in its sector of activity, the Galp Energia Group, listed on Euronext Lisbon.
We believe that the growing practice of value-based management among listed companies in Portugal, and the need to assess its results, make this investigation pertinent, in addition to the fact that there are few empirical studies on the question of value creation and its correlation with market value added and with the market value of equity of companies listed in Portugal.
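The two metrics at the heart of the study reduce to simple formulas: EVA = NOPAT − WACC × invested capital, and MVA = market value − invested capital. A minimal sketch with invented figures (not Galp Energia data):

```python
def eva(nopat, invested_capital, wacc):
    """Economic Value Added: net operating profit after tax minus the
    capital charge (WACC times invested capital)."""
    return nopat - wacc * invested_capital

def mva(market_value, invested_capital):
    """Market Value Added: market value of the firm minus the capital
    invested in it."""
    return market_value - invested_capital

# Hypothetical figures in millions of euros, purely illustrative:
nopat, capital, wacc = 950.0, 8_000.0, 0.09
eva_value = eva(nopat, capital, wacc)     # 950 - 0.09 * 8000 = approx. 230
mva_value = mva(10_500.0, capital)        # 10500 - 8000 = 2500
```

A positive EVA means the firm earned more than its cost of capital in the period; the study's question is whether such value creation correlates with MVA and market performance.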
Abstract:
The most common techniques for stress analysis/strength prediction of adhesive joints involve analytical or numerical methods such as the Finite Element Method (FEM). However, the Boundary Element Method (BEM) is an alternative numerical technique that has been successfully applied to the solution of a wide variety of engineering problems. This work evaluates the applicability of the boundary element code BEASY as a design tool to analyze adhesive joints. The linearity of peak shear and peel stresses with the applied displacement is studied and compared between BEASY and the analytical model of Frostig et al., considering a bonded single-lap joint under tensile loading. The BEM results are also compared with FEM in terms of stress distributions. To evaluate the mesh convergence of BEASY, the influence of mesh refinement on the peak shear and peel stress distributions is assessed. Joint stress predictions are carried out numerically in BEASY and ABAQUS®, and analytically by the models of Volkersen, Goland and Reissner, and Frostig et al. The failure loads for each model are compared with experimental results. The preparation, processing, and mesh creation times are compared for all models. The BEASY results showed good agreement with the conventional methods.
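For the analytical side, Volkersen's closed-form shear-stress distribution along the overlap can be sketched as follows. The restriction to identical adherends and all joint dimensions and material properties below are assumptions of this sketch, not values from the study:

```python
import numpy as np

def volkersen_shear(P, b, l, t, E, t_a, G_a, n=2001):
    """Volkersen shear-stress distribution along the overlap of a
    single-lap joint with identical adherends.
    P: applied load, b: joint width, l: overlap length,
    t: adherend thickness, E: adherend Young's modulus,
    t_a: adhesive thickness, G_a: adhesive shear modulus."""
    w = np.sqrt(2.0 * G_a / (E * t * t_a))   # characteristic parameter
    x = np.linspace(-l / 2.0, l / 2.0, n)    # coordinate along the overlap
    tau = (P * w / (2.0 * b)) * np.cosh(w * x) / np.sinh(w * l / 2.0)
    return x, tau

# Hypothetical aluminium/epoxy joint in SI units (illustration only):
x, tau = volkersen_shear(P=5e3, b=25e-3, l=25e-3, t=2e-3,
                         E=70e9, t_a=0.2e-3, G_a=1.1e9)
peak = tau.max()                        # peaks at the overlap ends
average = 5e3 / (25e-3 * 25e-3)         # uniform-shear reference P/(b*l)
```

The distribution integrates back to the applied load and peaks at the overlap ends, which is exactly where peak shear and peel stresses are monitored in the BEASY/FEM comparison.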
Abstract:
Doctoral thesis in Business Sciences
Abstract:
Much attention has been paid to the effects of climate change on species' range reductions and extinctions. There is, however, surprisingly little information on how climate-change-driven threat may impact the tree of life and result in loss of phylogenetic diversity (PD). Some plant families and mammalian orders reveal nonrandom extinction patterns, but many other plant families do not. Do these discrepancies reflect different speciation histories, and does climate-induced extinction result in the same discrepancies among different groups? Answers to these questions require representative taxon sampling. Here, we combine phylogenetic analyses, species distribution modeling, and climate change projections on two of the largest plant families in the Cape Floristic Region (Proteaceae and Restionaceae), as well as the second most diverse mammalian order in Southern Africa (Chiroptera), and an herbivorous insect genus (Platypleura) in the family Cicadidae to answer this question. We model current and future species distributions to assess species threat levels over the next 70 years, and then compare projected with random PD survival. Results for these animal and plant clades reveal congruence. PD losses are not significantly higher under predicted extinction than under random extinction simulations. So far the evidence suggests that focusing resources on climate-threatened species alone may not result in disproportionate benefits for the preservation of evolutionary history.
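The comparison of projected with random PD survival can be illustrated on a toy phylogeny: losing a threatened set of tips removes the branches unique to them, and the surviving PD is compared with the PD left after random losses of the same size. The tree, species names, and branch lengths below are all invented:

```python
import random

# Toy phylogeny: each species maps to the set of (branch_id, length)
# pairs on its root-to-tip path; shared ids are shared branches.
TREE = {
    "sp1": {("r", 1.0), ("a", 2.0), ("a1", 1.5)},
    "sp2": {("r", 1.0), ("a", 2.0), ("a2", 1.5)},
    "sp3": {("r", 1.0), ("b", 0.5), ("b1", 3.0)},
    "sp4": {("r", 1.0), ("b", 0.5), ("b2", 3.0)},
}

def faith_pd(species):
    """Phylogenetic diversity: total length of the branches spanned."""
    return sum(length for _, length in set.union(*[TREE[s] for s in species]))

def pd_loss_vs_random(threatened, n_sim=1000, seed=0):
    """Compare the PD surviving after losing `threatened` species with the
    distribution of PD after losing the same number of random species."""
    rng = random.Random(seed)
    observed = faith_pd([s for s in TREE if s not in threatened])
    sims = []
    for _ in range(n_sim):
        lost = rng.sample(sorted(TREE), len(threatened))
        sims.append(faith_pd([s for s in TREE if s not in lost]))
    return observed, sims

obs, sims = pd_loss_vs_random({"sp1"})
```

If `obs` sits well inside the simulated distribution, as the abstract reports for the real clades, the predicted extinctions do not prune disproportionately much evolutionary history.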
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, of both exact and approximate nature, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
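A classic way to implement such biased randomization, assumed here as an illustration rather than taken from the paper, is to draw from a best-first candidate list with a geometric distribution, as in a nearest-neighbour tour builder:

```python
import math
import random

def biased_choice(sorted_candidates, beta, rng):
    """Pick from a list sorted best-first using a geometric distribution:
    index = floor(ln(u) / ln(1 - beta)), wrapped to the list length.
    Small beta -> close to uniform; beta near 1 -> almost pure greedy."""
    u = rng.random()
    idx = int(math.log(u) / math.log(1.0 - beta)) % len(sorted_candidates)
    return sorted_candidates[idx]

def br_nearest_neighbour(dist, start=0, beta=0.3, seed=42):
    """Biased-randomised nearest-neighbour tour construction: at each step
    the next city is drawn from the distance-sorted candidate list instead
    of always taking the single closest one."""
    rng = random.Random(seed)
    tour, current = [start], start
    unvisited = set(range(len(dist))) - {start}
    while unvisited:
        ranked = sorted(unvisited, key=lambda j: dist[current][j])
        current = biased_choice(ranked, beta, rng)
        unvisited.remove(current)
        tour.append(current)
    return tour

# Tiny symmetric distance matrix (hypothetical):
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
tour = br_nearest_neighbour(D)
```

Re-running with different seeds yields a pool of distinct good tours, which is exactly the "large set of alternative good solutions" the approach aims for.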
Abstract:
By integrating the agency and stakeholder perspectives, this study aims to provide a systematic understanding of the firm- and institutional-level corporate governance factors that affect corporate social performance (CSP). We analyze a large global panel dataset and reveal that CSP is positively associated with board independence, but negatively with ownership concentration. These results underscore the idea that the benefits of CSP do not flow to shareholders to the same extent as the costs and that the allocation of resources to CSP is lower when shareholders are powerful. Furthermore, these findings indicate that independent directors should be understood as agents in their own right, not only focused on defending shareholder interests. We also find that CSP is negatively related to investor protection and shareholder-oriented environments, while it is positively related to egalitarian environments. Finally, we jointly analyze firm-level drivers and institutional contexts.
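The headline firm-level associations can be illustrated with ordinary least squares on a synthetic cross-section (all coefficients and data are invented; the study itself uses a large global panel with controls and institutional-level variables):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic firm observations (hypothetical): CSP rises with board
# independence and falls with ownership concentration, mimicking the
# signs of the paper's headline result; coefficients are invented.
n = 2000
board_indep = rng.uniform(0.2, 1.0, n)   # share of independent directors
own_conc = rng.uniform(0.0, 0.8, n)      # stake of the largest owner
csp = 0.5 + 0.8 * board_indep - 0.6 * own_conc + rng.normal(0, 0.05, n)

# Ordinary least squares via numpy's least-squares solver:
X = np.column_stack([np.ones(n), board_indep, own_conc])
coef, *_ = np.linalg.lstsq(X, csp, rcond=None)
# coef[1] recovers the positive board-independence effect,
# coef[2] the negative ownership-concentration effect.
```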
Abstract:
Among the types of remote sensing acquisitions, optical images are certainly one of the most widely relied upon data sources for Earth observation. They provide detailed measurements of the electromagnetic radiation reflected or emitted by each pixel in the scene. Through a process termed supervised land-cover classification, this makes it possible to distinguish objects at the surface of our planet automatically yet accurately. In this respect, when producing a land-cover map of the surveyed area, the availability of training examples representative of each thematic class is crucial for the success of the classification procedure. However, in real applications, due to several constraints on the sample collection process, labeled pixels are usually scarce. When analyzing an image for which those key samples are unavailable, a viable solution consists in resorting to the ground truth data of other previously acquired images. This option is attractive, but several factors such as atmospheric, ground and acquisition conditions can cause radiometric differences between the images, therefore hindering the transfer of knowledge from one image to another. The goal of this Thesis is to supply remote sensing image analysts with suitable processing techniques to ensure a robust portability of the classification models across different images. The ultimate purpose is to map the land-cover classes over large spatial and temporal extents with minimal ground information. To overcome, or simply quantify, the observed shifts in the statistical distribution of the spectra of the materials, we study four approaches issued from the field of machine learning. First, we propose a strategy to intelligently sample the image of interest to collect the labels only in correspondence of the most useful pixels. This iterative routine is based on a constant evaluation of the pertinence to the new image of the initial training data, which actually belong to a different image.
Second, an approach to reduce the radiometric differences among the images by projecting the respective pixels in a common new data space is presented. We analyze a kernel-based feature extraction framework suited for such problems, showing that, after this relative normalization, the cross-image generalization abilities of a classifier are highly increased. Third, we test a new data-driven measure of distance between probability distributions to assess the distortions caused by differences in the acquisition geometry affecting series of multi-angle images. Also, we gauge the portability of classification models through the sequences. In both exercises, the efficacy of classic physically- and statistically-based normalization methods is discussed. Finally, we explore a new family of approaches based on sparse representations of the samples to reciprocally convert the data space of two images. The projection function bridging the images allows a synthesis of new pixels with more similar characteristics ultimately facilitating the land-cover mapping across images.
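One common data-driven distance between the spectral distributions of two acquisitions, used here as an illustrative stand-in for the measure tested in the Thesis, is the kernel maximum mean discrepancy (MMD); the spectra below are synthetic:

```python
import numpy as np

def rbf_mmd2(X, Y, gamma=1.0):
    """Biased estimate of the squared maximum mean discrepancy with an
    RBF kernel: a data-driven distance between the distributions that
    generated the two samples (rows = samples, columns = features)."""
    def k(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
# Hypothetical 4-band pixel spectra from two acquisitions; the second is
# radiometrically shifted, mimicking changed acquisition conditions.
img_a = rng.normal(0.0, 1.0, (200, 4))
img_b = rng.normal(0.5, 1.0, (200, 4))
mmd_shifted = rbf_mmd2(img_a, img_b)  # clearly positive under the shift
mmd_same = rbf_mmd2(img_a, img_a)     # zero for identical samples
```

A normalization method that succeeds should drive the cross-image MMD back towards the same-image baseline.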
Abstract:
Improving safety at nighttime work zones is important because of the extra visibility concerns. The deployment of sequential lights is an innovative method for improving driver recognition of lane closures and work zone tapers. Sequential lights are wireless warning lights that flash in sequence to clearly delineate the taper at work zones. The effectiveness of sequential lights was investigated using controlled field studies. Traffic parameters were collected at the same field site with and without the deployment of sequential lights. Three surrogate performance measures were used to determine the impact of sequential lights on safety. These measures were the speeds of approaching vehicles, the number of late taper merges, and the locations where vehicles merged into the open lane from the closed lane. In addition, an economic analysis was conducted to monetize the benefits and costs of deploying sequential lights at nighttime work zones. The results of this study indicate that sequential warning lights had a net positive effect in reducing the speeds of approaching vehicles, enhancing driver compliance, and preventing late taper merges by passenger cars, trucks, and vehicles at rural work zones. Statistically significant decreases of 2.21 mph in the mean speed and 1 mph in the 85th-percentile speed resulted with sequential lights. The shift of the cumulative speed distributions to the left (i.e., a speed decrease) was also found to be statistically significant using the Mann-Whitney and Kolmogorov-Smirnov tests. However, a statistically significant increase of 0.91 mph in the speed standard deviation also resulted with sequential lights. With sequential lights, the percentage of vehicles that merged earlier increased from 53.49% to 65.36%. A benefit-cost ratio of around 5 or 10 resulted from this analysis of Missouri nighttime work zones and historical crash data. The two different benefit-cost ratios reflect two different ways of computing labor costs.
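The distribution-shift test mentioned can be reproduced in miniature; the sketch below implements the two-sample Kolmogorov-Smirnov statistic directly on synthetic speed data (speeds invented, with the shift fixed at 2.2 mph to echo the reported mean decrease):

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the two empirical cumulative distribution functions."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

rng = np.random.default_rng(1)
# Hypothetical approach speeds (mph): with the lights, the whole
# distribution shifts left, as reported in the study.
without_lights = rng.normal(55.0, 5.0, 300)
with_lights = without_lights - 2.2
d_shift = ks_statistic(without_lights, with_lights)   # clearly nonzero
d_null = ks_statistic(without_lights, without_lights) # exactly zero
```

In practice the statistic would be compared against a critical value (or a p-value from `scipy.stats.ks_2samp`) to declare the leftward shift significant.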
A recent inventory of the bats of Mozambique, with documentation of seven new species for the country
Abstract:
The bat fauna of Mozambique is poorly documented. We conducted a series of inventories across the country between 2005 and 2009, resulting in the identification of 50 species from 41 sites. Of these, seven species represent new national records that increase the country total to 67 species. These data include results from the first detailed surveys across northern Mozambique, over an area representing almost 50% of the country. We detail information on new distribution records and measurements of these specimens. Special attention is paid to the Rhinolophidae, because these include several taxa that are currently in a state of taxonomic confusion. Furthermore, we also present some notes on taxonomy, ecology and echolocation calls. Finally, we combine modelled distributions to present predicted species richness across the country. Species richness was lowest across the coastal plain, to the east and far north, and is predicted to increase in association with rising altitude and higher topographic unevenness of the landscape.
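Combining modelled distributions into a predicted richness map amounts to stacking binary presence layers and summing per cell; a minimal sketch with invented layers (one per species, 1 = predicted presence):

```python
import numpy as np

# Hypothetical binary species distribution model outputs on a 3x4 grid;
# each layer is one species' predicted presence/absence.
sdm_layers = np.array([
    [[1, 1, 0, 0],
     [1, 1, 1, 0],
     [0, 1, 1, 1]],
    [[0, 0, 0, 1],
     [0, 1, 1, 1],
     [0, 0, 1, 1]],
    [[1, 0, 0, 0],
     [1, 1, 0, 0],
     [1, 1, 0, 0]],
])

# Predicted species richness per cell is the sum over the stacked layers:
richness = sdm_layers.sum(axis=0)
```

On real data each layer would be a thresholded SDM raster sharing one grid, and the resulting surface is the predicted richness map described in the abstract.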
Abstract:
The aim of the study was to analyse the effects of the new Limited Liability Companies Act on a company's capital structure and distribution of assets, taking taxation issues into account. The objective was to examine what new possibilities the new Companies Act offers for arranging the equity structure and distributing assets. The main focus was on small, unlisted companies, whose shareholders are better placed to exploit the more liberal possibilities offered by the new Companies Act. The research method was qualitative, and for the study a KHT auditor (authorized public accountant), an owner-entrepreneur, and experts in company law and taxation were interviewed. According to the study, of the more liberal procedures enabled by the new Companies Act, the new reserve for invested unrestricted equity is the most significant, but its usability still depends on open tax questions. The new solvency test and the obligation to register negative equity pose challenges especially for small companies. The new Companies Act is very liberal and modern, but its significance ultimately depends on tax questions that remain open.
Abstract:
The main objective of this thesis has been to examine share repurchases and the rationality of repurchase motives in Finland, and to investigate, with the help of earlier empirical studies, whether share repurchases are justified also from the shareholders' point of view. A further aim of the thesis is to highlight the considerations and regulations that shareholders and company management should emphasise and take into account when making repurchase decisions. The research methodology of this study largely follows a conceptual-analytical approach. The conclusions of the study are mainly descriptive and prescriptive. Empirical material enters in the way typical of the conceptual-analytical approach, that is, in the form of existing empirical studies. In Finland, the benefits achieved from share repurchases are not quite as easily attainable for shareholders as in the United States. In Finland, the use of share repurchases as an alternative means of profit distribution is, for shareholders, a worse alternative in taxation than dividends. At present, share repurchasing is rather a company's means of financing acquisitions, funding employee stock option arrangements, tidying up the balance sheet, distributing excess cash to shareholders, and influencing the share price. Share repurchases can be carried out in Finland in a way that benefits shareholders, but the repurchase procedure needs to be developed, for example by carrying out genuine repurchases and by stating the repurchase motives more reliably. According to empirical evidence, repurchasing shares, for example through a public tender offer at a price above the market price, could also help convey management's "inside information" in Finland.
Abstract:
The neutron skin thickness of nuclei is a sensitive probe of the nuclear symmetry energy and has multiple implications for nuclear and astrophysical studies. However, precision measurements of this observable are difficult to obtain. The analysis of the experimental data may imply some assumptions about the bulk or surface nature of the formation of the neutron skin. Here we study the bulk or surface character of neutron skins of nuclei following from calculations with Gogny, Skyrme, and covariant nuclear mean-field interactions. These interactions are successful in describing nuclear charge radii and binding energies but predict different values for neutron skins. We perform the study by fitting two-parameter Fermi distributions to the calculated self-consistent neutron and proton densities. We note that the equivalent sharp radius is a more suitable reference quantity than the half-density radius parameter of the Fermi distributions to discern between the bulk and surface contributions in neutron skins. We present calculations for nuclei in the stability valley and for the isotopic chains of Sn and Pb.
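Fitting a two-parameter Fermi distribution rho(r) = rho0 / (1 + exp((r - R)/a)) to a calculated density can be sketched via the linearisation ln(rho0/rho - 1) = (r - R)/a, solved by least squares. The central density rho0 is assumed known here, and all numbers are synthetic (fm units), not results from the mean-field calculations:

```python
import numpy as np

def fermi(r, rho0, R, a):
    """Two-parameter Fermi distribution rho0 / (1 + exp((r - R)/a)):
    R is the half-density radius, a the surface diffuseness."""
    return rho0 / (1.0 + np.exp((r - R) / a))

def fit_fermi(r, rho, rho0):
    """Fit R and a given the central density rho0, using the
    linearisation ln(rho0/rho - 1) = (r - R)/a = r/a - R/a."""
    y = np.log(rho0 / rho - 1.0)
    A = np.column_stack([r, np.ones_like(r)])
    slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
    a = 1.0 / slope
    R = -intercept * a
    return R, a

# Synthetic neutron-like density profile (hypothetical parameters):
r = np.linspace(3.0, 9.0, 60)
rho = fermi(r, rho0=0.16, R=5.6, a=0.55)
R_fit, a_fit = fit_fermi(r, rho, rho0=0.16)
```

Repeating the fit for the neutron and proton densities of a nucleus gives the two half-density radii whose difference, together with the diffuseness values, separates bulk from surface contributions to the neutron skin.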