917 results for Stand-Alone PV


Relevance: 80.00%

Publisher:

Abstract:

Ensemble learning can increase the overall classification accuracy of a classifier by generating multiple base classifiers and combining their classification results. A frequently used family of base classifiers for ensemble learning is decision trees. However, alternative approaches can potentially be used, such as the Prism family of algorithms, which also induces classification rules. Compared with decision trees, Prism algorithms generate modular classification rules that cannot necessarily be represented in the form of a decision tree. Prism algorithms produce a classification accuracy similar to that of decision trees; in some cases, for example when there is noise in the training and test data, Prism algorithms can outperform decision trees by achieving a higher classification accuracy. However, Prism still tends to overfit on noisy data; hence, ensemble learners have been adopted in this work to reduce the overfitting. This paper describes the development of an ensemble learner using a member of the Prism family as the base classifier to reduce the overfitting of Prism algorithms on noisy datasets. The developed ensemble classifier is compared with a stand-alone Prism classifier in terms of classification accuracy and resistance to noise.
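The bagging-style combination described above can be sketched as follows. Since the Prism rule inducer itself is not reproduced in the abstract, a deliberately trivial one-rule learner stands in as the base classifier; all names and data here are illustrative, not from the paper:

```python
import random
from collections import Counter

def bootstrap(data, rng):
    # Resample len(data) points with replacement, as in bagging.
    return [rng.choice(data) for _ in data]

def train_stub(sample):
    # Hypothetical stand-in for a Prism-style rule inducer: it learns a
    # single rule "x >= threshold -> class 1" from (x, label) pairs.
    positives = [x for x, y in sample if y == 1]
    threshold = min(positives) if positives else float("inf")
    return lambda x: 1 if x >= threshold else 0

def ensemble_predict(classifiers, x):
    # Combine the base classifiers by majority vote.
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

rng = random.Random(0)
data = [(i, 1 if i >= 5 else 0) for i in range(10)]
ensemble = [train_stub(bootstrap(data, rng)) for _ in range(11)]
```

Each base classifier sees a slightly different bootstrap sample, so their individual overfitting errors tend to cancel in the vote, which is the mechanism the paper exploits against noise.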

Relevance: 80.00%

Publisher:

Abstract:

Practical realisation of cyborgs opens up significant new opportunities in many fields. In particular, when it comes to space travel, many of the limitations faced by humans in stand-alone form are transposed by the adoption of a cyborg persona. In this article a look is taken at different types of brain-computer interface which can be employed to realise cyborgs: biology-technology hybrids. The approach taken is a practical one with applications in mind, although some wider implications are also considered. In particular, results from experiments are discussed in terms of their meaning and application possibilities. The article is written from the perspective of scientific experimentation opening up realistic possibilities to be faced in the future, rather than giving conclusive comments on the technologies employed. Human implantation and the merger of biology and technology are, though, important elements.

Relevance: 80.00%

Publisher:

Abstract:

This manuscript describes the energy and water components of a new community land surface model called the Joint UK Land Environment Simulator (JULES), developed from the Met Office Surface Exchange Scheme (MOSES). It can be used as a stand-alone land surface model driven by observed forcing data, or coupled to an atmospheric global circulation model. The JULES model has been coupled to the Met Office Unified Model (UM) and as such provides a unique opportunity for the research community to contribute research that improves both world-leading operational weather forecasting and climate change prediction systems. In addition, JULES and its forerunner MOSES have been the basis for a number of very high-profile papers concerning the land surface and climate over the last decade. JULES has a modular structure aligned to physical processes, providing the basis for a flexible modelling platform.

Relevance: 80.00%

Publisher:

Abstract:

A stand-alone sea ice model is tuned and validated using satellite-derived, basinwide observations of sea ice thickness, extent, and velocity from the years 1993 to 2001. This is the first time that basin-scale measurements of sea ice thickness have been used for this purpose. The model is based on the CICE sea ice model code developed at the Los Alamos National Laboratory, with some minor modifications, and forcing consists of 40-yr ECMWF Re-Analysis (ERA-40) and Polar Exchange at the Sea Surface (POLES) data. Three parameters are varied in the tuning process: Ca, the air–ice drag coefficient; P*, the ice strength parameter; and α, the broadband albedo of cold bare ice, with the aim being to determine the subset of this three-dimensional parameter space that gives the best simultaneous agreement with observations with this forcing set. It is found that observations of sea ice extent and velocity alone are not sufficient to unambiguously tune the model, and that sea ice thickness measurements are necessary to locate a unique subset of parameter space in which simultaneous agreement is achieved with all three observational datasets.
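As an illustration of this kind of three-parameter tuning, the sketch below runs an exhaustive search over a small (Ca, P*, albedo) grid against a toy stand-in for the sea ice model; the model responses, grid values and "observations" are all invented for the example. It also reproduces the abstract's central point: extent and velocity observations alone leave the albedo unconstrained, while adding thickness makes the optimum unique.

```python
import itertools

def misfit(params, run_model, observations):
    # Sum of squared differences between model output and observations
    # (equal weighting here; the study's actual weighting may differ).
    sim = run_model(*params)
    return sum((sim[k] - observations[k]) ** 2 for k in observations)

def tune(run_model, observations, ca_vals, pstar_vals, albedo_vals):
    # Exhaustive search over the 3-D (Ca, P*, albedo) parameter space.
    grid = itertools.product(ca_vals, pstar_vals, albedo_vals)
    return min(grid, key=lambda p: misfit(p, run_model, observations))

def toy_model(ca, pstar, albedo):
    # Invented linear responses standing in for the CICE-based model.
    return {"thickness": albedo + 0.25 * pstar,
            "extent": 10.0 - 2.0 * ca,
            "velocity": ca / pstar}

obs = {"thickness": 1.25, "extent": 9.0, "velocity": 0.25}
best = tune(toy_model, obs, [0.25, 0.5, 1.0], [1.0, 2.0, 4.0], [0.5, 0.75, 1.0])

# Dropping thickness from the observations leaves several parameter
# combinations with zero misfit, i.e. the tuning becomes ambiguous.
partial_obs = {"extent": 9.0, "velocity": 0.25}
ambiguous = [p for p in itertools.product([0.25, 0.5, 1.0], [1.0, 2.0, 4.0],
                                          [0.5, 0.75, 1.0])
             if misfit(p, toy_model, partial_obs) == 0.0]
```

With all three observables the search returns a single best parameter set; with only extent and velocity, every albedo value fits equally well, mirroring the ambiguity the authors resolved with thickness data.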

Relevance: 80.00%

Publisher:

Abstract:

The purpose of this paper is to explore how companies that hold carbon trading accounts under the European Union Emissions Trading Scheme (EU ETS) respond to climate change by using disclosures on carbon emissions as a means to generate legitimacy, compared with others. The study is based on disclosures made in the annual reports and stand-alone sustainability reports of UK listed companies from 2001 to 2012, and uses content analysis to capture both the quality and the volume of the carbon disclosures. The results show a significant increase in both the quality and the volume of carbon disclosures after the launch of the EU ETS. Companies with carbon trading accounts provide more detailed disclosures than those without an account. We also find that company size is positively correlated with the disclosures, while the association with industry is inconclusive.

Relevance: 80.00%

Publisher:

Abstract:

We present here a straightforward method for obtaining a quantitative indication of an individual academic's research output. Different versions, selections and options are presented to enable a user to easily calculate values both for stand-alone papers and for a person's overall collection of outputs. The procedure is particularly useful as a metric giving a quantitative indication of a person's research output over a time window. Examples are included to show how the method works in practice and how it compares with alternative techniques.
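The paper's own formula is not reproduced in this abstract, so as a stand-in the sketch below computes one widely known quantitative indicator, the h-index, restricted to a publication-year window; it shows the general shape of such a windowed output metric, and the data are invented:

```python
def h_index(citation_counts):
    # Largest h such that at least h papers have >= h citations each.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def windowed_output(papers, start_year, end_year):
    # Score only the papers published inside the time window.
    window = [cites for year, cites in papers if start_year <= year <= end_year]
    return h_index(window)

# Hypothetical record: (publication year, citation count) per paper.
papers = [(2010, 12), (2011, 3), (2012, 7), (2014, 1), (2015, 5)]
score = windowed_output(papers, 2010, 2013)
```

Restricting the metric to a window, as here, is what lets the indicator track output over a chosen period rather than a whole career.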

Relevance: 80.00%

Publisher:

Abstract:

Global warming has attracted attention from all over the world and led to concern about carbon emissions. The Kyoto Protocol, the first major international regulatory emission trading scheme, was introduced in 1997 and outlined strategies for reducing carbon emissions (Ratnatunga et al., 2011). With the increased interest in carbon reduction, the Protocol came into force in 2005, and 191 nations have now ratified it (UNFCCC, 2012). Under cap-and-trade schemes, each company has a carbon emission target; when a company's emissions exceed that target, it must either face fines or buy emission allowances from other companies. Thus, unlike most other social and environmental issues, carbon emissions can impose costs on companies, both in introducing low-emission equipment and systems and in purchasing allowances when they emit more than their targets. Despite the importance of carbon emissions to companies, carbon emission reporting still operates in an unregulated environment, and companies are only required to disclose when the information is material either in value or in substance (Miller, 2005; Deegan and Rankin, 1997). Although there has been an increase in the volume of carbon emission disclosures in companies' financial reports and stand-alone social and environmental reports, showing their concern for the environment and their social responsibility (Peters and Romi, 2009), the motivations behind corporate carbon emission disclosures, and whether such disclosures affect corporate environmental reputation and financial performance, have yet to be explored. The problems with carbon emissions lie on both the financial and non-financial sides of corporate governance. On the one hand, corporations need to spend money on reducing emissions or pay penalties when they emit more than allowed.
On the other hand, as the public is more interested in environmental issues than before, carbon emissions can also affect a corporation's image regarding its environmental performance. The importance of the carbon emission issue is beginning to be recognised by companies across industries as one of the critical issues in supply chain management (Lee, 2011): in a study conducted by the Investor Responsibility Research Centre Institute for Corporate Responsibility (IRRCI), 80% of the companies analysed were found to face carbon risks resulting from emissions in their supply chains, and over 80% found that the majority of greenhouse gas (GHG) emissions come from electricity and other direct suppliers (Trucost, 2009). The review of the extant literature shows the increased importance of carbon emission issues and reveals a gap in the study of carbon reporting and disclosures, as well as in work linking corporate environmental reputation and corporate financial performance with carbon reporting (Lohmann, 2009a; Ratnatunga and Balachandran, 2009; Bebbington and Larrinaga-Gonzalez, 2008). This study focuses on investigating the current status of UK carbon emission disclosures, the determinant factors of corporate carbon disclosure, and the relationship between carbon emission disclosures and the environmental reputation and financial performance of UK listed companies from 2004 to 2012, and explores the explanatory power of classical disclosure theories.

Relevance: 80.00%

Publisher:

Abstract:

The climate over the Arctic has undergone changes in recent decades. In order to evaluate the coupled response of the Arctic system to external and internal forcing, our study focuses on the estimation of regional climate variability and its dependence on large-scale atmospheric and regional ocean circulations. A global ocean–sea ice model with regionally high horizontal resolution is coupled to an atmospheric regional model and a global terrestrial hydrology model. This way of coupling divides the global ocean model setup into two different domains: one coupled, where the ocean and the atmosphere are interacting, and one uncoupled, where the ocean model is driven by prescribed atmospheric forcing and runs in a so-called stand-alone mode. Therefore, selecting a specific area for the regional atmosphere implies that the ocean–atmosphere system can develop 'freely' in that area, whereas for the rest of the global ocean, the circulation is driven by prescribed atmospheric forcing without any feedbacks. Five different coupled setups are chosen for ensemble simulations. The choice of the coupled domains was made to estimate the influences of the Subtropical Atlantic, Eurasian and North Pacific regions on northern North Atlantic and Arctic climate. Our simulations show that the regional coupled ocean–atmosphere model is sensitive to the choice of the modelled area. The model configurations differ in how well they reproduce both the mean climate and its variability. Only two out of five model setups were able to reproduce the Arctic climate as observed under recent climate conditions (ERA-40 Reanalysis). Evidence is found that the main source of uncertainty for Arctic climate variability and its predictability is the North Pacific. The prescription of North Pacific conditions in the regional model leads to significant correlation with observations, even if the whole North Atlantic is within the coupled model domain.
However, the inclusion of the North Pacific area into the coupled system drastically changes the Arctic climate variability to a point where the Arctic Oscillation becomes an ‘internal mode’ of variability and correlations of year-to-year variability with observational data vanish. In line with previous studies, our simulations provide evidence that Arctic sea ice export is mainly due to ‘internal variability’ within the Arctic region. We conclude that the choice of model domains should be based on physical knowledge of the atmospheric and oceanic processes and not on ‘geographic’ reasons. This is particularly the case for areas like the Arctic, which has very complex feedbacks between components of the regional climate system.

Relevance: 80.00%

Publisher:

Abstract:

Satellite-based (e.g., Synthetic Aperture Radar [SAR]) water level observations (WLOs) of the floodplain can be sequentially assimilated into a hydrodynamic model to decrease forecast uncertainty. This has the potential to keep the forecast on track, providing an Earth Observation (EO) based flood forecast system. However, the operational applicability of such a system for floods developed over river networks requires further testing. One of the promising techniques for assimilation in this field is the family of ensemble Kalman (EnKF) filters. These filters use a limited-size ensemble representation of the forecast error covariance matrix. This representation tends to develop spurious correlations as the forecast-assimilation cycle proceeds, which is a further complication for dealing with floods in either urban areas or river junctions in rural environments. Here we evaluate the assimilation of WLOs obtained from a sequence of real SAR overpasses (the X-band COSMO-SkyMed constellation) in a case study. We show that a direct application of a global Ensemble Transform Kalman Filter (ETKF) suffers from filter divergence caused by spurious correlations. However, a spatially based filter localization substantially moderates the development of the forecast error covariance matrix, directly improving the forecast and also making it possible to further benefit from a simultaneous online inflow error estimation and correction. Additionally, we propose and evaluate a novel along-network metric for filter localization, which is physically meaningful for the flood-over-a-network problem. Using this metric, we further evaluate the simultaneous estimation of channel friction and spatially variable channel bathymetry, for which the filter seems able to converge simultaneously to sensible values. Results also indicate that friction is a second-order effect in flood inundation models applied to gradually varied flow in large rivers.
The study is not conclusive regarding whether in an operational situation the simultaneous estimation of friction and bathymetry helps the current forecast. Overall, the results indicate the feasibility of stand-alone EO-based operational flood forecasting.
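The localization idea above can be sketched as a Schur (element-wise) product of the ensemble covariance with a compactly supported taper evaluated on pairwise distances. Here the standard Gaspari-Cohn function plays the role of the taper, and the distance matrix stands in for the proposed along-network metric; all numbers are illustrative, not from the study.

```python
def gaspari_cohn(r):
    # Gaspari-Cohn fifth-order taper, a compactly supported correlation
    # function widely used for covariance localization; r = distance / scale,
    # and the taper vanishes for r >= 2.
    if r >= 2.0:
        return 0.0
    if r >= 1.0:
        return (((((r / 12 - 0.5) * r + 0.625) * r + 5.0 / 3.0) * r - 5.0) * r
                + 4.0 - 2.0 / (3.0 * r))
    return ((((-0.25 * r + 0.5) * r + 0.625) * r - 5.0 / 3.0) * r ** 2 + 1.0)

def localize(cov, distances, length_scale):
    # Schur (element-wise) product of the ensemble covariance with the
    # taper, damping spurious long-range sample correlations.
    n = len(cov)
    return [[cov[i][j] * gaspari_cohn(distances[i][j] / length_scale)
             for j in range(n)] for i in range(n)]

# Hypothetical along-network distances: two gauges on the same reach are
# close; a third lies far away along the channel network, past a junction.
dist = [[0.0, 1.0, 8.0],
        [1.0, 0.0, 7.0],
        [8.0, 7.0, 0.0]]
cov = [[1.0, 0.6, 0.5],   # the 0.5/0.4 entries play the role of spurious
       [0.6, 1.0, 0.4],   # long-range correlations from the small ensemble
       [0.5, 0.4, 1.0]]
loc = localize(cov, dist, 2.0)
```

After localization the long-range entries are driven to zero while nearby correlations are only mildly damped, which is how the filter avoids the divergence described above.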

Relevance: 80.00%

Publisher:

Abstract:

Research in Bid Tender Forecasting Models (BTFM) has been in progress since the 1950s. None of the models developed has been an easy-to-use tool for bidding practitioners, because of the advanced mathematical apparatus and massive data inputs required. This scenario began to change in 2012 with the development of the Smartbid BTFM, a comparatively simple model that presents a series of graphs enabling any project manager to study competitors using a relatively short historical tender dataset. However, despite the advantages of this new model, it is still necessary to study all the auction participants as an indivisible group; that is, the original BTFM was not devised for analyzing the behavior of a single bidding competitor or a subgroup of them. The present paper addresses that flaw and presents a stand-alone methodology for estimating future competitors' bidding behaviors separately.

Relevance: 80.00%

Publisher:

Abstract:

This paper proposes a parallel hardware architecture for image feature detection based on the Scale Invariant Feature Transform (SIFT) algorithm, applied to the Simultaneous Localization And Mapping (SLAM) problem. The work also proposes specific hardware optimizations considered fundamental to embedding such a robotic control system on a chip. The proposed architecture is completely stand-alone; it reads the input data directly from a CMOS image sensor and provides the results via a field-programmable gate array coupled to an embedded processor. The results may either be used directly in an on-chip application or accessed through an Ethernet connection. The system is able to detect features at up to 30 frames per second (320 x 240 pixels) and has accuracy similar to a PC-based implementation. The achieved system performance is at least one order of magnitude better than a PC-based solution, a result achieved by investigating the impact of several hardware-orientated optimizations on performance, area and accuracy.

Relevance: 80.00%

Publisher:

Abstract:

This research investigates the factors that lead Latin American non-financial firms to manage risks using derivatives. The main focus is on currency risk management. With this purpose, this thesis is divided into an introduction and two main chapters, which have been written as stand-alone papers. The first paper describes the results of a survey on derivatives usage and risk management responded by the CFOs of 74 Brazilian non-financial firms listed at the São Paulo Stock Exchange (BOVESPA), and the main evidence found is: i) larger firms are more likely to use financial derivatives; ii) foreign exchange risk is the most managed with derivatives; iii) Brazilian managers are more concerned with legal and institutional aspects in using derivatives, such as the taxation and accounting treatment of these instruments, than with issues related to implementing and maintaining a risk management program using derivatives. The second paper studies the determinants of risk management with derivatives in four Latin American countries (Argentina, Brazil, Chile and Mexico). I investigate not only the decision of whether to use financial derivatives or not, but also the magnitude of risk management, measured by the notional value of outstanding derivatives contracts. This is the first study, to the best of my knowledge, to use derivatives holdings information in emerging markets. The use of a multi-country setting allows the analysis of institutional and economic factors, such as foreign currency indebtedness, the high volatility of exchange rates, the instability of political and institutional framework and the development of financial markets, which are issues of second-order importance in developed markets. The main contribution of the second paper is on the understanding of the relationship among currency derivatives usage, foreign debt and the sensitivity of operational earnings to currency fluctuations in Latin American countries. 
Unlike previous findings for US firms, my evidence shows that derivatives held by Latin American firms are capable of producing cash flows comparable to financial expenses and investments, showing that derivatives are key instruments in their risk management strategies. It is also the first work to show strong and robust evidence that firms that benefit from local currency devaluation (e.g. exporters) have a natural currency hedge for foreign debt that allows them to bear higher levels of debt in foreign currency. This implies that firms with this revenue-cost structure require lower levels of hedging with derivatives. The findings also provide evidence that large firms are more likely to use derivatives, but the magnitude of derivatives holdings seems to be unrelated to the size of the firm, consistent with findings for US firms.

Relevance: 80.00%

Publisher:

Abstract:

In recent years the number of industrial applications for Augmented Reality (AR) and Virtual Reality (VR) environments has increased significantly. Optical tracking systems are an important component of AR/VR environments. In this work, a low-cost optical tracking system with attributes adequate for professional use is proposed. The system works in the infrared spectral region to reduce optical noise. A high-speed camera, equipped with a daylight-blocking filter and infrared flash strobes, transfers uncompressed grayscale images to a regular PC, where image pre-processing software and the PTrack tracking algorithm recognize a set of retro-reflective markers and extract their 3D positions and orientations. Also included in this work is a comprehensive review of image pre-processing and tracking algorithms. A testbed was built to perform accuracy and precision tests. Results show that the system reaches accuracy and precision levels slightly worse than, but still comparable to, professional systems. Due to its modularity, the system can be expanded by linking several one-camera tracking modules with a sensor fusion algorithm, in order to obtain a larger working range. A setup with two modules was built and tested, resulting in performance similar to the stand-alone configuration.

Relevance: 80.00%

Publisher:

Abstract:

This thesis examines the characteristics of the decision process in which creditors choose between the court-supervised reorganization (recuperação judicial) and the liquidation of a financially distressed firm. The work is divided into four chapters. The second chapter systematically presents the theoretical framework and empirical evidence, highlighting important results from studies in the areas of corporate reorganization and bankruptcy. The chapter also presents three case studies to show the complexity of each case with respect to the concentration of claims, conflicts of interest among creditor classes, and the final decision on the approval or rejection of the reorganization plan. The third chapter analyses the determinants of delay in voting on the reorganization plan, proposing an empirical study of delays between 2005 and 2014. The results suggest that: (i) greater concentration of debt among creditor classes is associated with shorter delays; (ii) a larger number of banks voting on the reorganization plan is associated with longer delays; (iii) the average voting delay decreases when only one class of creditors votes on the plan; (iv) labour and secured creditors delay the vote when the value of the assets securing the debt in the event of liquidation is higher; (v) the average voting delay is greater when the debtor's industry performs poorly, with the delay requested by the unsecured and secured classes; and (vi) the proposed sale of assets is the main topic discussed in plan-voting meetings in the cases with the longest delays. Finally, the fourth chapter presents evidence on creditors' votes and the probability of approval of the reorganization plan.
The results suggest that: (i) labour creditors are inclined to approve the reorganization plan even when it is rejected by the other classes; (ii) plans with more heterogeneous payment proposals across the three creditor classes are less likely to be accepted; (iii) the chance of plan approval decreases when more unsecured creditors participate in the reorganization; and (iv) plans proposing the sale of assets are more likely to be approved. Finally, greater concentration of debt in the secured class reduces the chance of plan approval, while the opposite holds for the unsecured class.