927 results for distributions to shareholders


Relevance:

90.00%

Publisher:

Abstract:

Prediction of species' distributions is central to diverse applications in ecology, evolution and conservation science. There is increasing electronic access to vast sets of occurrence records in museums and herbaria, yet little effective guidance on how best to use this information in the context of numerous approaches for modelling distributions. To meet this need, we compared 16 modelling methods over 226 species from 6 regions of the world, creating the most comprehensive set of model comparisons to date. We used presence-only data to fit models, and independent presence-absence data to evaluate the predictions. Along with well-established modelling methods such as generalised additive models, GARP, and BIOCLIM, we explored methods that either have been developed recently or have rarely been applied to modelling species' distributions. These include machine-learning methods and community models, both of which have features that may make them particularly well suited to noisy or sparse information, as is typical of species' occurrence data. Presence-only data were effective for modelling species' distributions for many species and regions. The novel methods consistently outperformed more established methods. The results of our analysis are promising for the use of data from museums and herbaria, especially as methods suited to the noise inherent in such data improve.
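As an orienting aside, the sketch below shows a minimal BIOCLIM-style climate-envelope classifier fitted to presence-only records, one of the established method families named above. The variable names, percentile cut-offs, and toy data are illustrative assumptions, not the protocol used in the study.

```python
import numpy as np

def bioclim_envelope(presence_env, candidate_env, lo=5, hi=95):
    """Score candidate sites against a BIOCLIM-style percentile envelope.

    presence_env : (n_presences, n_climate_vars) climate values at occurrence records
    candidate_env: (n_sites, n_climate_vars) climate values at sites to be scored
    Returns the fraction of climate variables for which each candidate site
    falls inside the [lo, hi] percentile envelope of the presence records.
    """
    lower = np.percentile(presence_env, lo, axis=0)
    upper = np.percentile(presence_env, hi, axis=0)
    inside = (candidate_env >= lower) & (candidate_env <= upper)
    return inside.mean(axis=1)  # 1.0 = inside the envelope for every variable

# Illustrative use with random data standing in for real occurrence records.
rng = np.random.default_rng(0)
presences = rng.normal(loc=[15.0, 1200.0], scale=[2.0, 150.0], size=(60, 2))
sites = rng.normal(loc=[14.0, 1000.0], scale=[5.0, 400.0], size=(1000, 2))
scores = bioclim_envelope(presences, sites)
```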

Relevance:

90.00%

Publisher:

Abstract:

The aim of the study was to examine the forms of asset distribution from a limited liability company and the related company-law and tax-law regulation in Finland. Four tasks were set for the study: to establish how the Companies Act regulates the distribution of assets from a limited company and what changes the new Companies Act brings to that regulation; to examine how dividend distributions were taxed under the corporate tax credit (imputation) system and how the new tax legislation affects the tax planning of profit distribution in a limited company; and to determine whether the taxation of distributions from limited companies has become stricter. The study showed that the Companies Act regulates the distribution of assets in a limited company very precisely. Distributions are permitted within the limits of distributable funds. Under the government bill for the new Companies Act, the company's solvency must also be taken into account when distributing assets. During the corporate tax credit system, dividend distributions were subject to single taxation. In the new system dividends are also taxed in the hands of the recipient, so taxation is partly double. In some cases it is more advantageous for shareholders to draw salary from the company rather than dividends. As a consequence of the reform, the importance of tax planning also grows.

Relevance:

90.00%

Publisher:

Abstract:

The aim of this paper is essentially twofold: first, to describe the use of spherical nonparametric estimators for determining statistical diagnostic fields from ensembles of feature tracks on a global domain, and second, to report the application of these techniques to data derived from a modern general circulation model. New spherical kernel functions are introduced that are more efficiently computed than the traditional exponential kernels. The data-driven techniques of cross-validation to determine the amount of smoothing objectively, and adaptive smoothing to vary the smoothing locally, are also considered. Also introduced are techniques for combining seasonal statistical distributions to produce longer-term statistical distributions. Although all calculations are performed globally, only the results for the Northern Hemisphere winter (December, January, February) and Southern Hemisphere winter (June, July, August) cyclonic activity are presented, discussed, and compared with previous studies. Overall, results for the two hemispheric winters are in good agreement with previous studies, both model-based and observational.
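The abstract does not give the new kernels explicitly, so the following sketch only illustrates the general idea of kernel density estimation on the sphere. It uses a compactly supported polynomial kernel in the cosine of angular separation as a stand-in that avoids exponential evaluations; all names and the bandwidth value are assumptions.

```python
import numpy as np

def lonlat_to_unit(lon_deg, lat_deg):
    """Convert longitude/latitude in degrees to unit vectors on the sphere."""
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    return np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)

def spherical_kde(track_lon, track_lat, grid_lon, grid_lat, h=0.2):
    """Kernel estimate of feature-track density on the sphere.

    Uses K = max(0, 1 - (1 - cos(angle))/h)^2, a compactly supported polynomial
    kernel in the cosine of angular separation, so no exponentials are evaluated.
    The bandwidth h controls the smoothing and could in principle be chosen by
    cross-validation, as the paper suggests.
    """
    X = lonlat_to_unit(track_lon, track_lat)   # (n_points, 3)
    G = lonlat_to_unit(grid_lon, grid_lat)     # (n_grid, 3)
    cosang = G @ X.T                           # cosine of angular separation
    K = np.clip(1.0 - (1.0 - cosang) / h, 0.0, None) ** 2
    dens = K.sum(axis=1)
    return dens / dens.sum()                   # relative density on the grid
```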

Relevance:

90.00%

Publisher:

Abstract:

Recent observations from the EISCAT incoherent scatter radar have revealed bursts of poleward ion flow in the dayside auroral ionosphere which are consistent with the ionospheric signature of flux transfer events at the magnetopause. These bursts frequently contain ion drifts which exceed the neutral thermal speed and, because the neutral thermospheric wind is incapable of responding sufficiently rapidly, toroidal, non-Maxwellian ion velocity distributions are expected. The EISCAT observations are made with high time resolution (15 seconds) and at a large angle to the geomagnetic field (73.5°), allowing the non-Maxwellian nature of the distribution to be observed remotely for the first time. The observed features are also strongly suggestive of a toroidal distribution: characteristic spectral shape, increased scattered power (both consistent with reduced Landau damping and enhanced electric field fluctuations) and excessively high line-of-sight ion temperatures deduced if a Maxwellian distribution is assumed. These remote sensing observations allow the evolution of the distributions to be observed. They are found to be non-Maxwellian whenever the ion drift exceeds the neutral thermal speed, indicating that such distributions can exist over the time scale of the flow burst events (several minutes).
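The abstract states the criterion in words only; in the ionospheric literature the same condition is often summarized by the ratio of the ion-neutral relative drift to the neutral thermal speed (commonly written D*). The restatement below is provided for orientation and is not taken from the paper:

```latex
% Toroidal, non-Maxwellian ion velocity distributions are expected when the
% ion drift relative to the neutral gas exceeds the neutral thermal speed:
D^{*} = \frac{|\mathbf{v}_i - \mathbf{u}_n|}{\sqrt{2 k_B T_n / m_n}} \gtrsim 1
```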

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Social networks are common in digital health. A new stream of research is beginning to investigate the mechanisms of digital health social networks (DHSNs), how they are structured, how they function, and how their growth can be nurtured and managed. DHSNs increase in value when additional content is added, and the structure of networks may resemble the characteristics of power laws. Power laws are contrary to traditional Gaussian averages in that they demonstrate correlated phenomena. OBJECTIVES: The objective of this study is to investigate whether the distribution frequency in four DHSNs can be characterized as following a power law. A second objective is to describe the method used to determine the comparison. METHODS: Data from four DHSNs—Alcohol Help Center (AHC), Depression Center (DC), Panic Center (PC), and Stop Smoking Center (SSC)—were compared to power law distributions. To assist future researchers and managers, the 5-step methodology used to analyze and compare datasets is described. RESULTS: All four DHSNs were found to have right-skewed distributions, indicating the data were not normally distributed. When power trend lines were added to each frequency distribution, R² values indicated that, to a very high degree, the variance in post frequencies can be explained by actor rank (AHC .962, DC .975, PC .969, SSC .95). Spearman correlations provided further indication of the strength and statistical significance of the relationship (AHC .987, DC .967, PC .983, SSC .993, P<.001). CONCLUSIONS: This is the first study to investigate power distributions across multiple DHSNs, each addressing a unique condition. Results indicate that despite vast differences in theme, content, and length of existence, DHSNs follow properties of power laws. The structure of DHSNs is important as it gives insight to researchers and managers into the nature and mechanisms of network functionality. The 5-step process undertaken to compare actor contribution patterns can be replicated in networks that are managed by other organizations, and we conjecture that patterns observed in this study could be found in other DHSNs. Future research should analyze network growth over time and examine the characteristics and survival rates of superusers.
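The authors' 5-step procedure is only named in the abstract, so the following is merely a hedged sketch of the kind of rank-frequency comparison described: a log-log power trend fit (giving an R² value) plus a Spearman rank correlation. Function names and the toy contribution pattern are assumptions.

```python
import numpy as np
from scipy import stats

def power_law_check(post_counts):
    """Compare an actor post-frequency distribution with a power trend line.

    post_counts: number of posts contributed by each actor in the network.
    Returns (exponent, r_squared, spearman_rho, spearman_p) from a log-log
    least-squares fit of post frequency against actor rank.
    """
    freq = np.sort(np.asarray(post_counts, dtype=float))[::-1]  # rank-ordered
    rank = np.arange(1, len(freq) + 1)
    slope, intercept, r, p, se = stats.linregress(np.log10(rank), np.log10(freq))
    rho, rho_p = stats.spearmanr(rank, freq)
    return slope, r**2, rho, rho_p

# Toy usage: a heavily right-skewed contribution pattern with a few superusers.
counts = np.concatenate([np.repeat(1, 800), np.repeat(5, 150), [50, 120, 900]])
exponent, r2, rho, p = power_law_check(counts)
```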

Relevance:

90.00%

Publisher:

Abstract:

The penetration, translocation, and distribution of ultrafine and nanoparticles in tissues and cells are challenging issues in aerosol research. This article describes a set of novel quantitative microscopic methods for evaluating particle distributions within sectional images of tissues and cells by addressing the following questions: (1) is the observed distribution of particles between spatial compartments random? (2) Which compartments are preferentially targeted by particles? and (3) Does the observed particle distribution shift between different experimental groups? Each of these questions can be addressed by testing an appropriate null hypothesis. The methods all require observed particle distributions to be estimated by counting the number of particles associated with each defined compartment. For studying preferential labeling of compartments, the size of each of the compartments must also be estimated by counting the number of points of a randomly superimposed test grid that hit the different compartments. The latter provides information about the particle distribution that would be expected if the particles were randomly distributed, that is, the expected number of particles. From these data, we can calculate a relative deposition index (RDI) by dividing the observed number of particles by the expected number of particles. The RDI indicates whether the observed number of particles corresponds to that predicted solely by compartment size (for which RDI = 1). Within one group, the observed and expected particle distributions are compared by chi-squared analysis. The total chi-squared value indicates whether an observed distribution is random. If not, the partial chi-squared values help to identify those compartments that are preferential targets of the particles (RDI > 1). Particle distributions between different groups can be compared in a similar way by contingency table analysis. We first describe the preconditions and the way to implement these methods, then provide three worked examples, and finally discuss the advantages, pitfalls, and limitations of this method.
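A minimal sketch of the counting analysis as described in the abstract, assuming per-compartment particle counts and test-grid point counts have already been tallied; SciPy's chi-squared helper stands in for whatever software the authors used.

```python
import numpy as np
from scipy.stats import chisquare

def rdi_analysis(observed_particles, grid_points):
    """Relative deposition index (RDI) and chi-squared analysis per compartment.

    observed_particles: particle counts per compartment.
    grid_points:        test-grid points hitting each compartment (its relative size).
    """
    obs = np.asarray(observed_particles, dtype=float)
    pts = np.asarray(grid_points, dtype=float)
    expected = obs.sum() * pts / pts.sum()   # counts expected from compartment size alone
    rdi = obs / expected                     # RDI = 1 means deposition as expected by size
    chi2_total, p_value = chisquare(obs, expected)   # is the overall distribution random?
    partial_chi2 = (obs - expected) ** 2 / expected  # which compartments drive the deviation
    return rdi, chi2_total, p_value, partial_chi2

# Example: three compartments, with the second apparently targeted (RDI > 1).
rdi, chi2, p, partial = rdi_analysis([12, 55, 33], [40, 25, 35])
```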

Relevance:

90.00%

Publisher:

Abstract:

The importance of competition between similar species in driving community assembly is much debated. Recently, phylogenetic patterns in species composition have been investigated to help resolve this question: phylogenetic clustering is taken to imply environmental filtering, and phylogenetic overdispersion to indicate limiting similarity between species. We used experimental plant communities with random species compositions and initially even abundance distributions to examine the development of phylogenetic pattern in species abundance distributions. Where composition was held constant by weeding, abundance distributions became overdispersed through time, but only in communities that contained distantly related clades, some with several species (i.e., a mix of closely and distantly related species). Phylogenetic pattern in composition therefore constrained the development of overdispersed abundance distributions, and this might indicate limiting similarity between close relatives and facilitation/complementarity between distant relatives. Comparing the phylogenetic patterns in these communities with those expected from the monoculture abundances of the constituent species revealed that interspecific competition caused the phylogenetic patterns. Opening experimental communities to colonization by all species in the species pool led to convergence in phylogenetic diversity. At convergence, communities were composed of several distantly related but species-rich clades and had overdispersed abundance distributions. This suggests that limiting similarity processes determine which species dominate a community but not which species occur in a community. Crucially, as our study was carried out in experimental communities, we could rule out local evolutionary or dispersal explanations for the patterns and identify ecological processes as the driving force, underlining the advantages of studying these processes in experimental communities. Our results show that phylogenetic relations between species provide a good guide to understanding community structure and add a new perspective to the evidence that niche complementarity is critical in driving community assembly.

Relevance:

90.00%

Publisher:

Abstract:

The goal of this study was to test the hypothesis that the aggregated state of natural marine particles constrains the sensitivity of optical beam attenuation to particle size. An instrumented bottom tripod was deployed at the 12-m node of the Martha's Vineyard Coastal Observatory to monitor particle size distributions, particle size-versus-settling-velocity relationships, and the beam attenuation coefficient (c_p) in the bottom boundary layer in September 2007. An automated in situ filtration system on the tripod collected 24 direct estimates of suspended particulate mass (SPM) during each of five deployments. On a sampling interval of 5 min, data from a Sequoia Scientific LISST 100x Type B were merged with data from a digital floc camera to generate suspended particle volume size distributions spanning diameters from approximately 2 µm to 4 cm. Diameter-dependent densities were calculated from size-versus-settling-velocity data, allowing conversion of the volume size distributions to mass distributions, which were used to estimate SPM every 5 min. Estimated SPM and measured c_p from the LISST 100x were linearly correlated throughout the experiment, despite wide variations in particle size. The slope of the line, which is the ratio of c_p to SPM, was 0.22 m² g⁻¹. Individual estimates of c_p:SPM were between 0.2 and 0.4 m² g⁻¹ for volumetric median particle diameters ranging from 10 to 150 µm. The wide range of values in c_p:SPM in the literature likely results from three factors capable of producing factor-of-two variability in the ratio: particle size, particle composition, and the finite acceptance angle of commercial beam transmissometers.
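A hedged sketch of the mass-conversion step described above: a volume size distribution is converted to SPM using a diameter-dependent density, and the result is compared with a measured beam attenuation coefficient. The density law, bin values, and attenuation value below are placeholders, not the deployment data.

```python
import numpy as np

def spm_from_volume_distribution(diameters_um, volume_conc_ul_per_l, rho_of_d):
    """Convert a particle volume size distribution to suspended particulate mass.

    diameters_um:         bin-centre diameters in micrometres.
    volume_conc_ul_per_l: particle volume concentration per size bin (uL/L).
    rho_of_d:             callable returning effective particle density (kg m^-3)
                          as a function of diameter, e.g. derived from settling velocities.
    Returns SPM in g m^-3.
    """
    d = np.asarray(diameters_um, dtype=float)
    vol_fraction = np.asarray(volume_conc_ul_per_l, dtype=float) * 1e-6  # uL/L -> m^3/m^3
    spm_kg_m3 = np.sum(vol_fraction * rho_of_d(d))
    return spm_kg_m3 * 1e3  # kg m^-3 -> g m^-3

def floc_density(d_um):
    """Placeholder density law: excess density decreasing with floc size."""
    return 1025.0 + 600.0 * (d_um / 10.0) ** -0.8

diam = np.array([5.0, 20.0, 80.0, 300.0])   # um
vol = np.array([0.5, 1.5, 2.0, 1.0])        # uL/L per size bin
spm = spm_from_volume_distribution(diam, vol, floc_density)
cp = 0.9                                    # example beam attenuation, m^-1
cp_to_spm = cp / spm                        # m^2 g^-1, to be compared with ~0.22
```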

Relevance:

90.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

90.00%

Publisher:

Abstract:

This chapter takes a social theory of practice approach to examining institutional work; that is, how institutions are created, maintained, and disrupted through the actions, interactions, and negotiations of multiple actors. We examine alternative approaches that organizations use to deal with institutional pluralism based on a longitudinal real-time case study of a utility company grappling with opposing market and regulatory logics over time. These two logics required the firm to both mitigate its significant market power and also maintain its commercially competitive focus and responsiveness to shareholders. Institutional theorists have long acknowledged that institutions have a central logic (Friedland & Alford, 1991) or rationality (DiMaggio & Powell, 1983; Scott, 1995/2001; Townley, 2002), comprising a set of material and symbolic practices and organizing principles that provide logics of action for organizations and individuals, who then reproduce the institutions through their actions (Glynn & Lounsbury, 2005; Suddaby & Greenwood, 2005). Despite a monolithic feel to much institutional theory, in which a dominant institutional logic appears to prevail, institutional theorists also acknowledge the plurality of institutions (e.g. Friedland & Alford, 1991; Kraatz & Block, 2008; Lounsbury, 2007; Meyer & Rowan, 1977; Whittington, 1992). While these pluralistic institutions may be interdependent, they are not considered to coexist in harmony; “There is no question but that many competing and inconsistent logics exist in modern society” (Scott, 1995: 130).

Relevance:

90.00%

Publisher:

Abstract:

We are the first to examine the market reaction to 13 announcement dates related to IFRS 9 for over 5400 European listed firms. We find an overall positive reaction to the introduction of IFRS 9. The regulation is particularly beneficial to shareholders of firms in countries with weaker rule of law and a smaller divergence between local GAAP and IAS 39. Bootstrap simulations rule out the possibility that sampling error or data mining is driving our findings. Our main findings are also robust to confounding events and the extent of the media coverage for each event. These results suggest that investors perceive the new regulation as shareholder-wealth enhancing and support the view that stronger comparability across accounting standards of European firms is beneficial to international investors and outweighs the costs of poorer firm-specific information.
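The abstract refers to bootstrap simulations without detailing them; the sketch below shows one common way to bootstrap a zero-mean test for cumulative abnormal returns around an announcement date. It is an assumption-laden illustration, not the authors' design.

```python
import numpy as np

def bootstrap_car_pvalue(abnormal_returns, n_boot=10_000, seed=0):
    """Bootstrap test that the mean cumulative abnormal return (CAR) is zero.

    abnormal_returns: one CAR per firm for a given announcement window.
    Resamples firms with replacement under the null of zero mean (centred
    resampling) and returns the observed mean CAR and a two-sided p-value.
    """
    rng = np.random.default_rng(seed)
    car = np.asarray(abnormal_returns, dtype=float)
    observed = car.mean()
    centred = car - observed                      # impose the null hypothesis
    boot_means = np.array([rng.choice(centred, size=car.size, replace=True).mean()
                           for _ in range(n_boot)])
    p = np.mean(np.abs(boot_means) >= abs(observed))
    return observed, p

# Toy usage: 40 firm-level CARs for one announcement window.
cars = np.random.default_rng(1).normal(0.004, 0.02, size=40)
mean_car, p_value = bootstrap_car_pvalue(cars)
```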

Relevance:

90.00%

Publisher:

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as “histogram binning” inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with such a dilemma for many years, resorting either to hardware approaches that are rather costly with inherent calibration and noise effects, or to software techniques based on filtering the binning effect but without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for its optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of such an inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
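The dissertation's derivations and the patented procedure are not reproduced in the abstract. Purely to illustrate the idea of accumulating log-transformed counts in a non-integer, multi-channel fashion via a lookup table, here is a hedged sketch in which each linear channel's counts are split between the two neighbouring logarithmic output channels; all sizes and names are assumptions.

```python
import numpy as np

def build_log_lookup(n_linear, n_log):
    """Precompute, for each linear input channel, its fractional (non-integer)
    position on a logarithmic output axis -- the lookup table."""
    linear = np.arange(1, n_linear + 1, dtype=float)
    return np.log10(linear) / np.log10(n_linear) * (n_log - 1)

def accumulate_log_histogram(linear_counts, lookup, n_log):
    """Spread each linear channel's counts over the two neighbouring log channels
    in proportion to the fractional position, instead of dumping them all into a
    single integer bin (which is what produces the binning artifact)."""
    hist = np.zeros(n_log)
    lower = np.floor(lookup).astype(int)
    frac = lookup - lower
    upper = np.minimum(lower + 1, n_log - 1)
    np.add.at(hist, lower, linear_counts * (1.0 - frac))
    np.add.at(hist, upper, linear_counts * frac)
    return hist

# Usage: 262,144-channel linear data mapped onto a 1024-channel logarithmic axis.
lookup = build_log_lookup(n_linear=262_144, n_log=1024)
counts = np.random.default_rng(1).poisson(3.0, size=262_144).astype(float)
log_hist = accumulate_log_histogram(counts, lookup, n_log=1024)
```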

Relevance:

90.00%

Publisher:

Abstract:

Cloud edge mixing plays an important role in the life cycle and development of clouds. Entrainment of subsaturated air affects the cloud at the microscale, altering the number density and size distribution of its droplets. The resulting effect is determined by two timescales: the time required for the mixing event to complete, and the time required for the droplets to adjust to their new environment. If mixing is rapid, evaporation of droplets is uniform and said to be homogeneous in nature. In contrast, slow mixing (compared to the adjustment timescale) results in the droplets adjusting to the transient state of the mixture, producing an inhomogeneous result. Studying this process in real clouds involves the use of airborne optical instruments capable of measuring clouds at the "single particle" level. Single-particle resolution allows for direct measurement of the droplet size distribution. This is in contrast to other "bulk" methods (e.g., hot-wire probes, lidar, radar), which measure a higher-order moment of the distribution and require assumptions about the distribution shape to compute a size distribution. The sampling strategy of current optical instruments requires them to integrate over a path tens to hundreds of meters long to form a single size distribution. This is much larger than typical mixing scales (which can extend down to the order of centimeters), resulting in difficulties resolving mixing signatures. The Holodec is an optical particle instrument that uses digital holography to record discrete, local volumes of droplets. This method allows statistically significant size distributions to be calculated for centimeter-scale volumes, giving full resolution at the scales important to the mixing process. The hologram also records the three-dimensional position of all particles within the volume, allowing the spatial structure of the cloud volume to be studied. Both of these features represent a new and unique view into the mixing problem. In this dissertation, holographic data recorded during two different field projects are analyzed to study the mixing structure of cumulus clouds. Using Holodec data, it is shown that mixing at cloud top can produce regions of clear but humid air that can subside down along the edge of the cloud as a narrow shell, or advect downshear as a "humid halo". This air is then entrained into the cloud at lower levels, producing mixing that appears to be very inhomogeneous. This inhomogeneous-like mixing is shown to be well correlated with regions containing elevated concentrations of large droplets. This is used to argue in favor of the hypothesis that dilution can lead to enhanced droplet growth rates. I also make observations on the microscale spatial structure of observed cloud volumes recorded by the Holodec.
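The competition between the two timescales described above is commonly condensed, in the cloud-physics literature, into a Damköhler number. The abstract does not use the term, so the following is only an orienting restatement:

```latex
% Ratio of the mixing timescale to the droplet adjustment (evaporation) timescale;
% Da >> 1 favours inhomogeneous mixing, Da << 1 favours homogeneous mixing.
\mathrm{Da} = \frac{\tau_{\mathrm{mix}}}{\tau_{\mathrm{evap}}}
```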

Relevance:

80.00%

Publisher:

Abstract:

We use the Kharzeev-Levin-Nardi (KLN) model of the low-x gluon distributions to fit recent HERA data on F_L and F_2^c (F_2^b). Having checked that this model gives a good description of the data, we use it to predict F_L and F_2^c to be measured in a future electron-ion collider. The results are similar to those obtained with the de Florian-Sassot and Eskola-Paukkunen-Salgado nuclear gluon distributions. The conclusion of this exercise is that the KLN model, simple as it is, may still be used as an auxiliary tool to make estimates for both heavy-ion and electron-ion collisions.
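The abstract does not spell out the model; for orientation only, KLN-type gluon distributions are typically built around a saturation scale with the parametrization below, where the quoted exponent range is a commonly used assumption rather than this paper's fitted value:

```latex
% Saturation scale governing KLN-type unintegrated gluon distributions;
% Q_0, x_0 and lambda are fit parameters (lambda is typically ~0.2-0.3).
Q_s^2(x) = Q_0^2 \left( \frac{x_0}{x} \right)^{\lambda}
```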