960 results for "profitability analyzing"
Abstract:
When a company decides to invest in a project, it must obtain the resources needed to make the investment. The alternatives are to use the firm's internal resources or to obtain external resources through debt contracts and the issuance of shares. Decisions about the mix of internal resources, debt, and equity in the total resources used to finance a company's activities concern the choice of its capital structure. Although there are many finance studies on the determinants of firms' indebtedness, the issue of capital structure is still controversial. This work sought to identify the predominant factors that determine the capital structure of Brazilian publicly held, non-financial firms. A quantitative approach was used, applying the statistical technique of multiple linear regression to panel data. Estimates were made by ordinary least squares with a fixed-effects model. Approximately 116 companies were selected to participate in this research, covering the period from 2003 to 2007. The variables and hypotheses tested in this study were built on theories of capital structure and on empirical research. Results indicate that risk, size, asset composition, and firm growth influence indebtedness. The profitability variable was not relevant to the composition of indebtedness of the companies analyzed. However, when only long-term debt is analyzed, the relevant variables are firm size and, especially, asset composition (tangibility): the smaller the firm, or the greater the share of fixed assets in total assets, the greater its propensity toward long-term debt. Furthermore, this research could not identify a predominant theory to explain the capital structure of Brazilian firms.
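The estimation approach described (OLS on panel data with firm fixed effects) can be illustrated with a minimal sketch; the file name and variable names below are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch (not the study's actual code): fixed-effects OLS on a firm-year
# panel, with hypothetical column names for the debt ratio and its determinants.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per firm-year, 2003-2007.
df = pd.read_csv("firm_panel.csv")  # columns: firm, year, debt_ratio, risk, size, tangibility, growth, profitability

# Firm fixed effects via entity dummies; standard errors clustered by firm.
model = smf.ols(
    "debt_ratio ~ risk + size + tangibility + growth + profitability + C(firm)",
    data=df,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["firm"]})
print(result.summary())
```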
Abstract:
Part 4: Transition Towards Product-Service Systems
Abstract:
In the last decade, the complexity of the environment in which organizations are embedded has increased dramatically: on one side, consumers are increasingly demanding regarding the quality and value of products; on the other, companies need to reduce operating costs to achieve greater profitability without losing growth or market share. In this context arises the need to effectively structure actions that align operational work processes with the organization's strategic planning so that business objectives are achieved, ensuring the competitiveness of the organization. This study aims to analyze how supply chain management has been carried out in a grocery retailer, in light of the guidelines of Supply Chain Management and using the SCOR model. Carrying out this study required research classified as exploratory and descriptive with respect to its goals, and as documentary, field, and case-study research with respect to its procedures. Data processing was qualitative, using the thematic categorical analysis of Bardin (1977). To obtain data, interviews were conducted with the operational and strategic management of a company here named Supermarket Omega. After analyzing the information obtained, it is perceived that the organization makes an effort to enhance its supply chain management; however, there is a lack of alignment between the various areas that compose it. Regarding its work processes, we stress that the company's focus is still directed more at sales than at profitability, although it is undergoing a transformation of its organizational culture; moreover, many improvement projects are under development. Thus, it can be noticed that there is some consistency between the assumptions of the SCOR model and the practices applied within Supermarket Omega's supply chain, although a greater effort is still needed to better align with the model.
Abstract:
This paper aims, first, to show the effect of Entrepreneurial Orientation (EO) on SMEs' financial performance and, second, to propose a contingency model that explores the moderating effect of environmental hostility on the EO-financial performance relationship. To examine the research hypotheses, a sample of 121 manufacturing SMEs located in Catalonia, Spain was used. The results confirm a positive EO-financial performance relation and suggest that the relation is stronger when there is a fit between EO and the environment. Finally, the academic and entrepreneurial implications related to EO and the SMEs' environment are presented and discussed.
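A minimal sketch of how such a moderation (contingency) effect is commonly tested, using a regression with an EO × hostility interaction term; the data file and variable names are hypothetical and this is not the paper's actual model specification.

```python
# Illustrative sketch (hypothetical variable names): testing whether environmental
# hostility moderates the EO-performance relation via an interaction term.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("smes_survey.csv")  # columns: performance, eo, hostility, size, age (hypothetical)

# Center the predictors before forming the interaction to ease interpretation.
for col in ("eo", "hostility"):
    df[col + "_c"] = df[col] - df[col].mean()

model = smf.ols("performance ~ eo_c * hostility_c + size + age", data=df).fit()
print(model.summary())  # a significant eo_c:hostility_c term indicates moderation
```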
Abstract:
We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. We use profit maximization measures based on directional accuracy and trading strategies in addition to standard loss minimization measures. When comparing predictive accuracy and profit measures, tests that are free of data-snooping bias are used. The results indicate that forecast combinations, in particular those based on principal components of forecasts, help to improve over benchmark trading strategies, although the excess return per unit of deviation is limited.
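A rough sketch of the kind of principal-component forecast combination and directional-accuracy scoring described above, on made-up data; the rescaling step is a crude stand-in for the regression-based mappings typically used in this literature.

```python
# Illustrative sketch (made-up data): combining individual EUR/USD forecasts via their
# first principal component and scoring them with directional accuracy and a naive trading rule.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
actual = rng.normal(0.0, 0.005, size=200)                        # hypothetical realized returns
forecasts = actual[:, None] + rng.normal(0.0, 0.01, (200, 5))    # five noisy individual forecasts

avg = forecasts.mean(axis=1)                                     # simple equal-weight combination
pc1 = PCA(n_components=1).fit_transform(forecasts).ravel()       # principal-component combination
if np.corrcoef(pc1, avg)[0, 1] < 0:                              # PC sign is arbitrary; align with avg
    pc1 = -pc1
combined = avg.mean() + (pc1 - pc1.mean()) * avg.std() / pc1.std()  # crude rescaling to return units

def directional_accuracy(y, yhat):
    return float(np.mean(np.sign(y) == np.sign(yhat)))

for name, f in [("equal weight", avg), ("principal component", combined)]:
    print(name, "DA:", directional_accuracy(actual, f),
          "strategy mean return:", float(np.mean(np.sign(f) * actual)))
```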
Abstract:
Increases in oil prices after the economic recession have brought surprising growth in domestic oil production in the United States since the beginning of 2009. Not only did conventional oil extraction increase, but unconventional oil production and exploration also improved greatly under the favorable economic conditions. This favorable economy encourages companies to invest in new reservoirs and technological developments. Recently, enhanced drilling techniques, including hydraulic fracturing and horizontal drilling, have been supporting the domestic economy by way of unconventional shale and tight oil from various U.S. locations. One of the main contributors to this oil boom is unconventional oil production from the North Dakota Bakken field. Horizontal drilling has increased oil production in the Bakken field, but the economics of unconventional oil extraction are still debatable due to volatile oil prices, high production decline rates, a limited production period, high production costs, and a lack of transportation. The economic profitability and viability of the unconventional oil play in the North Dakota Bakken were tested with an economic analysis of average Bakken unconventional well features. Scenario analysis demonstrated that a typical North Dakota Bakken unconventional oil well is profitable and viable, as shown by three financial metrics: net present value, internal rate of return, and break-even prices.
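The kind of cash-flow analysis described can be sketched as follows; all well parameters (initial rate, decline, costs, royalty and tax burden) are illustrative placeholders rather than the study's Bakken averages.

```python
# Illustrative sketch (hypothetical well parameters): cash-flow model for an
# unconventional well with hyperbolic decline, scored by NPV, IRR and break-even price.
import numpy as np
from scipy.optimize import brentq

def monthly_production(months, qi=600.0, di=0.30, b=1.2):
    """Arps hyperbolic decline, bbl/day averaged over each month (illustrative values)."""
    t = np.arange(months) / 12.0                       # time in years
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def cash_flows(price, months=120, capex=8e6, opex_per_bbl=12.0, royalty_tax=0.30):
    q = monthly_production(months) * 30.4              # barrels per month
    net_revenue = q * price * (1.0 - royalty_tax) - q * opex_per_bbl
    return np.concatenate(([-capex], net_revenue))     # upfront drilling cost, then monthly margins

def npv(rate_annual, cf):
    r = (1.0 + rate_annual) ** (1.0 / 12.0) - 1.0      # monthly discount rate
    return np.sum(cf / (1.0 + r) ** np.arange(len(cf)))

def irr(cf):
    return brentq(lambda r: npv(r, cf), -0.9, 10.0)

cf = cash_flows(price=70.0)
print("NPV @ 10%:", npv(0.10, cf))
print("IRR:", irr(cf))
print("break-even price @ 10%:", brentq(lambda p: npv(0.10, cash_flows(p)), 1.0, 300.0))
```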
Abstract:
The tidal influence on the Big Pine Key saltwater/freshwater interface was analyzed using time-lapse electrical resistivity imaging and shallow well measurements. The transition zone at the saltwater/freshwater interface was measured over part of a tidal cycle along three profiles. The resistivity was converted to salinity by deriving a formation factor for the Miami Oolite. A SEAWAT model was created to attempt to recreate the field measurements and test previously established hydrogeologic parameters. The results imply that the tide only affects the groundwater within 20 to 30 m of the coast. The effect is small and caused by flooding from the high tide. The low relief of the island means this effect is very sensitive to small changes in the magnitude. The SEAWAT model proved to be insufficient in modeling this effect. The study suggests that the extent of flooding is the largest influence on the salinity of the groundwater.
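The resistivity-to-salinity conversion via a formation factor can be sketched as below; the formation factor and the conductivity-to-TDS coefficient are placeholder values, not the ones derived for the Miami Oolite.

```python
# Illustrative sketch: converting bulk resistivity to pore-water salinity using a
# site-derived formation factor (values below are placeholders, not the study's).
import numpy as np

def water_resistivity(bulk_resistivity_ohm_m, formation_factor=5.0):
    """Archie's relation F = rho_bulk / rho_water, solved for the pore water."""
    return bulk_resistivity_ohm_m / formation_factor

def salinity_mg_per_l(water_resistivity_ohm_m, k=0.65):
    """Rough TDS estimate from fluid conductivity (TDS ~ k * EC in uS/cm)."""
    ec_us_cm = 1.0e4 / water_resistivity_ohm_m    # 1 S/m = 1/(ohm*m) = 10^4 uS/cm
    return k * ec_us_cm

rho_w = water_resistivity(np.array([1.0, 5.0, 50.0]))   # hypothetical inverted resistivities
print(salinity_mg_per_l(rho_w))                         # mg/L, fresh to brackish to saline
```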
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method to mine Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at one time. However, predictive tools need to consider the tradeoffs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
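As a simplified illustration of single-variable atomicity checking (in the spirit of the dynamic prediction described, though not McPatom's model-checking approach), the sketch below flags the classic unserializable three-access interleavings between two threads.

```python
# Simplified illustration: flag unserializable interleavings of two local accesses
# with one remote access in between, on a single shared variable.
UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"), ("R", "W", "W"), ("W", "R", "W")}

def atomicity_violations(trace):
    """trace: list of (thread_id, op) accesses to one shared variable, op in {'R', 'W'}."""
    violations = []
    for i in range(len(trace) - 2):
        (t1, a), (t2, b), (t3, c) = trace[i], trace[i + 1], trace[i + 2]
        if t1 == t3 and t2 != t1 and (a, b, c) in UNSERIALIZABLE:
            violations.append((i, (a, b, c)))
    return violations

# Hypothetical interleaving of threads 0 and 1 on a shared counter.
print(atomicity_violations([(0, "R"), (1, "W"), (0, "W"), (1, "R")]))
```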
Abstract:
Since most large-scale solar PV plants are built in arid and semi-arid areas where land availability and solar radiation are high, the performance of PV plants in such locations is expected to be significantly affected by high cell temperatures as well as by soiling. It is therefore essential to study how different PV module technologies perform in such geographical locations to ensure consistent and reliable power delivery over the lifetime of the PV power plants. As soiling is strongly dependent on the climatic conditions of a particular location, a test station consisting of about 24 PV modules and a well-equipped weather station was built within the fences of Scatec's 75 MW Kalkbult solar PV plant in South Africa. This study was performed to better understand the effect of soiling by comparing the relative power generation of cleaned modules to that of uncleaned modules. Such knowledge can enable more quantitative evaluations of the cleaning strategies to be implemented in larger solar PV power plants. The data collected and recorded at the test station were analyzed at IFE, Norway, using a MATLAB script written for this thesis project. This thesis work was done at IFE, Norway, in collaboration with Stellenbosch University in South Africa and Scatec Solar, a Norwegian independent power producer. For the polycrystalline modules, the average temperature-corrected efficiency during the period of the experiment was 15.00 ± 0.08 %; for the thin-film CdTe modules with ARC it was 11.52 %, and for the thin-film modules without ARC about 11.13 %, each with a standard uncertainty of ±0.01 %. Comparing the initial relative average efficiency of the polycrystalline-Si modules, when all modules were cleaned for the first time, with the final relative efficiency after the last cleaning schedule (when the reference modules E, F, G, and H were cleaned for the last time), it is found that poly3 performs 2 % and 3 % better than poly1 and poly16 respectively, poly13 performs 1 % better than poly15, and poly5 and poly12 perform 1 % and 2 % better than poly10 respectively. In addition, poly5 and poly12 perform 9 % and 11 % better than poly7. There is no change in performance between poly6 and poly9, or between poly4 and poly15. However, the performance gains of poly3 over poly1, poly13 over poly15, and poly5 and poly12 over poly10 are insignificant. It is also found that TF22 performs 7 % better than the uncleaned reference module TF24 and, similarly, TF21 performs 7 % better than TF23. Furthermore, for the modules with ARC glass (TF17, TF18, TF19, and TF20), cleaning the modules with only distilled water (TF19), or dry-cleaning after cleaning with distilled water (TF20), decreases module performance by 5 % and 4 % compared with the respective uncleaned reference modules TF17 and TF18.
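The temperature-corrected efficiency comparison between cleaned and uncleaned modules can be sketched as follows; the irradiance, power and temperature samples and the temperature coefficient are made-up values, not measurements from the Kalkbult test station.

```python
# Illustrative sketch: temperature-correcting module efficiency to 25 C and comparing
# a cleaned module with an uncleaned reference to estimate soiling loss (made-up data).
import numpy as np

def efficiency(p_out_w, irradiance_w_m2, area_m2):
    return p_out_w / (irradiance_w_m2 * area_m2)

def temp_corrected(eff, t_cell_c, gamma=-0.0041):
    """Correct measured efficiency to 25 C cell temperature (gamma: power temperature coefficient, 1/C)."""
    return eff / (1.0 + gamma * (t_cell_c - 25.0))

# Hypothetical midday samples for a cleaned and an uncleaned polycrystalline module.
g, area = 950.0, 1.6                       # W/m2, m2
t_cell = np.array([48.0, 52.0, 50.0])
p_clean = np.array([205.0, 200.0, 203.0])
p_dirty = np.array([198.0, 192.0, 196.0])

eta_clean = temp_corrected(efficiency(p_clean, g, area), t_cell).mean()
eta_dirty = temp_corrected(efficiency(p_dirty, g, area), t_cell).mean()
print("estimated soiling loss: %.1f %%" % (100.0 * (1.0 - eta_dirty / eta_clean)))
```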
Abstract:
Within the company ECOSISTEC S.A.S. there is a problem that is eroding profitability and compromising the company's long-term viability. The approach proposed in this applied project makes it possible to address the reputation problem that has placed the company in a critical state and has considerably limited its performance over recent years. Through an external and internal analysis, and through the application of different methodologies within the company, the aim is to understand the current situation of the organization and of the Colombian market, which will ultimately make it possible to find an optimal solution that improves the current state of the organization by increasing its market share and the image that potential customers have of ECOSISTEC S.A.S.
Abstract:
Microorganisms are involved in the deterioration of Cultural Heritage. Thus, there is a need to enhance the techniques used for their detection and identification. RNA Fluorescent In Situ Hybridization (RNA-FISH) has been successfully applied for phylogenetic identification of the viable components of the microbial communities colonizing artworks both in situ and ex situ. Until recently, it was time-consuming, taking not less than 6 h for the analysis. We have developed an RNA-FISH in suspension protocol that allowed ex situ analysis of microorganisms involved in artworks’ biodeterioration in 5 h. In this work, three modified protocols, involving microwave heating, were evaluated for further shortening two of the four main critical steps in RNA-FISH: hybridization and washing. The original and modified protocols were applied in cellular suspensions of bacteria and yeast isolates. The results obtained were evaluated and compared in terms of detectability and specificity of the signals detected by epifluorescence microscopy. One of the methods tested showed good and specific FISH signals for all the microorganisms selected and did not produce signals evidencing non-specific or fixation-induced fluorescence. This 3 h protocol allows a remarkable reduction of the time usually required for performing RNA-FISH analysis in Cultural Heritage samples. Thus, a rapid alternative for analyzing yeast and bacteria cells colonizing artworks’ surfaces by RNA-FISH is presented in this work.
Abstract:
In recent years, Facebook and other social media have become key players in branding activities. However, empirical research is still needed about the way in which consumer-based brand equity is created on social media. The purpose of this paper is to study the relationship between masculine and feminine brand personality and brand equity on Facebook, and to analyze the mediating role of consumer-brand engagement and brand love on this relationship. Data were collected using an online survey with 614 valid responses. The hypotheses were tested using structural equation modeling. Results support 7 of the 11 hypotheses, with significant relationships between the analyzed constructs. This study confirms the advantages of a clear gender positioning and extends prior research by suggesting that brands with a strong brand gender identity will encourage brand love. Results also highlight that brand love has a mediating role on the relationship between brand gender and overall brand equity.
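A simplified stand-in for the paper's structural equation models: the mediating role of brand love can be probed by bootstrapping the indirect effect with two regressions; the data file and variable names below are hypothetical.

```python
# Simplified stand-in for the paper's SEM (hypothetical variable names): bootstrapping
# the indirect effect of brand gender on brand equity through brand love.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("facebook_survey.csv")  # columns: brand_gender, brand_love, brand_equity

def indirect_effect(data):
    a = smf.ols("brand_love ~ brand_gender", data=data).fit().params["brand_gender"]
    b = smf.ols("brand_equity ~ brand_love + brand_gender", data=data).fit().params["brand_love"]
    return a * b

rng = np.random.default_rng(0)
boot = [indirect_effect(df.sample(frac=1.0, replace=True, random_state=rng.integers(1 << 31)))
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print("indirect effect:", indirect_effect(df), "95% CI:", (lo, hi))
```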
Abstract:
Over the last decades, the growing evidence of human-caused climate change has raised awareness of the consequences of exceeding a 2 °C rise in global temperature. This awareness has led to a contemporary approach to the conceptualization and management of green adaptation policies in spatial planning. This thesis aims to develop a comprehensive methodology for assessing the adaptability of existing neighborhoods to green strategies. The reliability of the proposed method is examined in the cities of Bologna and Imola and proved applicable to other geographical locations. The thesis integrates three key themes of conceptual and implementation principles for urban green adaptation. First, it defines methods for narrowing uncertainties in energy forecasting modeling for urban planning by exploring the roles of integrated energy planning. Second, exploring green retrofitting strategies in buildings, it examines the effects of various energy-saving factors in roofing scenarios, including a green roof, a rooftop greenhouse, and an insulated roof. Lastly, it analyzes green strategies in urban spaces to enhance thermal comfort by addressing urban heat exposure related to urban heat island effects. The roles of integrated energy policies and green strategic thinking are discussed to highlight various aspects of green adaptation at the neighborhood scale. This thesis develops approaches by which cities can face the challenges of current green urban planning and connect the conceptual and practical aspects of green spatial planning. Another point this thesis highlights is that, due to the interdependency of individuals and places, it is difficult to tell whether large-scale adaptation policies are enhancing the resiliency of the neighborhood or simply shuffling vulnerability among individuals and places. It also asserts that neglecting to reflect on these reallocations of effects generates serious complications and will result in long-term dysfunctional consequences.
Abstract:
Machine learning is widely adopted to decode multi-variate neural time series, including electroencephalographic (EEG) and single-cell recordings. Recent solutions based on deep learning (DL) outperformed traditional decoders by automatically extracting relevant discriminative features from raw or minimally pre-processed signals. Convolutional Neural Networks (CNNs) have been successfully applied to EEG and are the most common DL-based EEG decoders in the state-of-the-art (SOA). However, the current research is affected by some limitations. SOA CNNs for EEG decoding usually exploit deep and heavy structures with the risk of overfitting small datasets, and architectures are often defined empirically. Furthermore, CNNs are mainly validated by designing within-subject decoders. Crucially, the automatically learned features mainly remain unexplored; conversely, interpreting these features may be of great value to use decoders also as analysis tools, highlighting neural signatures underlying the different decoded brain or behavioral states in a data-driven way. Lastly, SOA DL-based algorithms used to decode single-cell recordings rely on more complex, slower to train and less interpretable networks than CNNs, and the use of CNNs with these signals has not been investigated. This PhD research addresses the previous limitations, with reference to P300 and motor decoding from EEG, and motor decoding from single-neuron activity. CNNs were designed light, compact, and interpretable. Moreover, multiple training strategies were adopted, including transfer learning, which could reduce training times promoting the application of CNNs in practice. Furthermore, CNN-based EEG analyses were proposed to study neural features in the spatial, temporal and frequency domains, and proved to better highlight and enhance relevant neural features related to P300 and motor states than canonical EEG analyses. Remarkably, these analyses could be used, in perspective, to design novel EEG biomarkers for neurological or neurodevelopmental disorders. Lastly, CNNs were developed to decode single-neuron activity, providing a better compromise between performance and model complexity.
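A minimal sketch of a light, compact CNN of the kind described, with a temporal convolution followed by a spatial (channel) convolution; this is an illustrative architecture, not the one developed in the thesis.

```python
# Illustrative sketch: a light CNN for EEG trial classification. A temporal convolution
# learns frequency-selective filters; a grouped spatial convolution mixes channels with
# few parameters, keeping the model compact for small EEG datasets.
import torch
import torch.nn as nn

class LightEEGNet(nn.Module):
    def __init__(self, n_channels=32, n_samples=512, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(1, 33), padding=(0, 16), bias=False),    # temporal filters
            nn.BatchNorm2d(8),
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1), groups=8, bias=False),  # spatial filters
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.25),
        )
        self.classifier = nn.Linear(16 * (n_samples // 8), n_classes)

    def forward(self, x):            # x: (batch, 1, channels, samples)
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = LightEEGNet()
dummy = torch.randn(4, 1, 32, 512)   # 4 hypothetical EEG trials
print(model(dummy).shape)            # torch.Size([4, 2])
```

The grouped (depthwise-style) spatial convolution is one common way to keep the parameter count low enough for within-subject EEG datasets.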
Abstract:
As people spend a third of their lives at work and, in most cases, indoors, the work environment assumes crucial importance. The continuous and dynamic interaction between people and the working environment surrounding them produces physiological and psychological effects on operators. Recognizing the substantial impact of comfort and well-being on employee satisfaction and job performance, the literature underscores the need for industries to implement indoor environment control strategies to ensure long-term success and profitability. However, managing physical risks (i.e., ergonomic and microclimatic) in industrial environments is often constrained by production and energy requirements. In the food processing industry, for example, the safety of perishable products dictates storage temperatures that do not allow for operator comfort. Conversely, warehouses dedicated to non-perishable products often lack cooling systems in order to limit energy expenditure, reaching high temperatures in the summer period. Moreover, exceptional events, like the COVID-19 pandemic, introduce new constraints, with recommendations that affect thermal stress and respiratory health. Furthermore, the thesis highlights how workers' characteristics, particularly the aging process, reduce tolerance to environmental stresses. Consequently, prolonged exposure to environmental stress at work results in cardiovascular disease and musculoskeletal disorders. In response to the global trend of an aging workforce, the thesis bridges a literature gap by proposing methods and models that integrate the age factor into comfort assessment. It aims to present technical and technological solutions to mitigate microclimate risks in industrial environments, ultimately seeking innovative ways to enhance the aging workforce's comfort, performance, experience, and skills. The research outlines a logical-conceptual scheme with three main areas of focus: analyzing factors influencing the work environment, recognizing constraints to worker comfort, and designing solutions. The results contribute significantly to science by laying the foundation for new research on worker health and safety in the highly topical industrial context of an aging working population.