911 results for Mandatory e-procurement
Abstract:
To enhance the utilization of wood, sawmills are forced to place more emphasis on planning in order to master the whole production chain from the forest to the end product. One significant obstacle to integrating the forest-sawmill-market production chain is the lack of appropriate information about forest stands. Since the wood procurement point of view in forest planning systems has been almost totally disregarded, there has been a great need to develop an easy and efficient pre-harvest measurement method allowing separate measurement of stands prior to harvesting. The main purpose of this study was to develop a measurement method for pine stands which forest managers could use in describing the properties of the standing trees for sawing production planning. Study materials were collected from ten Scots pine (Pinus sylvestris) stands located in North Häme and South Pohjanmaa, in southern Finland. The data comprise test sawing data on 314 pine stems, dbh and height measurements of all trees, measurements of the quality parameters of pine sawlog stems in all ten study stands, and the locations of all trees in six stands. The study was divided into four sub-studies dealing with pine quality prediction, construction of diameter and dead branch height distributions, sampling designs, and the application of height and crown height models. The final proposal for the pre-harvest measurement method is a synthesis of the individual sub-studies. The quality analysis resulted in choosing dbh, distance from stump height to the first dead branch (dead branch height), crown height and tree height as the most appropriate quality characteristics of Scots pine. Dbh and dead branch height are measured on each pine sample tree, while height and crown height are derived from dbh measurements with the aid of mixed height and crown height models. The pine and spruce diameter distributions, as well as the dead branch height distribution, are most effectively predicted by the kernel function. Roughly 25 sample trees appear to be sufficient in pure pine stands. In mixed stands, the number of sample trees needs to be increased in proportion to the share of pines in order to attain the same level of accuracy.
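The computational core of this abstract is predicting a stand's diameter (dbh) distribution from roughly 25 sample trees with a kernel function. The following is a minimal sketch of a Gaussian kernel density estimate under that reading; the bandwidth rule and the sample values are illustrative assumptions, not the study's actual data or parametrization.

```python
import numpy as np

def kernel_dbh_distribution(sample_dbh_cm, eval_points_cm, bandwidth=None):
    """Gaussian kernel density estimate of a stand's dbh distribution.

    sample_dbh_cm : dbh measurements of the sample trees (e.g. ~25 values)
    eval_points_cm: diameters at which the density is evaluated
    bandwidth     : smoothing parameter; Silverman's rule of thumb if None
                    (an assumption -- the study's bandwidth choice is not
                    stated in the abstract)
    """
    x = np.asarray(sample_dbh_cm, dtype=float)
    n = x.size
    if bandwidth is None:
        bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)  # Silverman's rule
    u = (np.asarray(eval_points_cm, dtype=float)[:, None] - x[None, :]) / bandwidth
    kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)    # Gaussian kernel
    return kernel.sum(axis=1) / (n * bandwidth)             # density per cm

# Illustrative use: 25 hypothetical sample-tree dbh values (cm)
rng = np.random.default_rng(0)
sample = rng.normal(22.0, 4.0, size=25)
grid = np.linspace(5, 40, 71)
density = kernel_dbh_distribution(sample, grid)
```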
Abstract:
The present study evaluates the feasibility of undelimbed Scots pine (Pinus sylvestris L.) for the integrated production of pulp and energy in a kraft pulp mill from the technical, economic and environmental points of view, focusing on the potential of bundle harvesting. The feasibility of tree sections for pulp production was tested by conducting an industrial wood-handling experiment and laboratory cooking and bleaching trials, using conventional small-diameter Scots pine pulpwood as a reference. These trials showed that undelimbed Scots pine sections can be processed in favourable conditions as a blend with conventional small-diameter pulpwood without reducing the pulp quality. However, fibre losses at various phases of the process may increase when using undelimbed material. In the economic evaluation, both pulp production and wood procurement costs were considered, using the relative wood paying capability of a kraft pulp mill as a determinant. The calculations were made for three Scots pine first-thinning stands with the breast-height diameter of the removal (6–12 cm) as the main distinguishing factor. The supply chains included in the comparison were based on cut-to-length harvesting, whole-tree harvesting and bundle harvesting (whole-tree bundling). With the current ratio of pulp and energy prices, the wood paying capability declines as the proportion of the energy fraction of the raw material increases. The supply system based on the cut-to-length method was the most efficient option, resulting in the highest residual value at stump in most cases. A decline in the pulp price and an increase in the energy price improved the competitiveness of the whole-tree systems. With short truck transportation distances and low pulp prices, however, the harvesting of loose whole trees can result in a higher residual value at stump in small-diameter stands. While savings in transportation costs did not compensate for the high cutting and compaction costs of the second prototype of the bundle harvester, an increase in transportation distances improved its competitiveness. Since harvesting undelimbed assortments increases nutrient export from the site, which can affect soil productivity, the whole-tree alternatives included in the present study cannot be recommended on infertile peatlands and mineral soils. The harvesting of loose or bundled whole trees implies a reduction in protective logging residues and an increase in site traffic or payloads. These factors increase the risk of soil damage, especially on peat soils with poor bearing capacity. Within the wood procurement parameters that were examined, the CO2 emissions of the supply systems varied from 13 to 27 kg m⁻³. Compaction of whole trees into bundles reduced emissions from transportation by 30–39%, but these reductions were insufficient to compensate for the increased emissions from cutting and compaction.
Abstract:
The Finnish forest industry is in the middle of a radical change. The deepening recession and the falling demand for the woodworking industry's traditional products have forced the sawmilling industry, too, to find new and more fertile solutions to improve its operational preconditions. In recent years, the role of bioenergy production has often been highlighted as a part of sawmills' business repertoire. Sawmilling naturally produces a lot of by-products (e.g. bark, sawdust, chips) which could be exploited more effectively in energy production, and this would bring more income or perhaps even create new business opportunities for sawmills. Production of bioenergy is also supported by the government's climate and energy policies favouring renewable energy sources, by public financial subsidies, and by the soaring prices of fossil fuels. The decreasing production of the domestic pulp and paper industry also releases a fair amount of sawmills' by-products for other uses. However, bioenergy production as a part of sawmills' by-product utilization has so far received very little research attention from a managerial point of view. The purpose of this study was to explore the relative significance of the main bioenergy-related processes, resources and factors at Finnish independent industrial sawmills, including partnerships, cooperation, customer relationships and investments, as well as the future perspectives of bioenergy business at these sawmills, with the help of two resource-based approaches (the resource-based view and the natural-resource-based view). The data of the study comprised secondary data (e.g. literature) and primary data obtained from interviews with sawmill managers (or equivalent persons in charge of decisions regarding bioenergy production at the sawmill). A literature review and the Delphi method with two questionnaires were utilized as the methods of the study. According to the results, the most significant processes related to the value chain of bioenergy business are connected to raw material availability and procurement, and to customer relationship management. In addition to raw material and services, the most significant resources included factory and machinery, personnel, collaboration, and geographic location. Long-term cooperation deals were clearly valued as the most significant form of collaboration, especially in processes connected to raw material procurement. The results also revealed that factors related to demand, subsidies and prices had the highest importance in connection with sawmills' future bioenergy business. However, a majority of the respondents required that certain preconditions connected to the above-mentioned factors be fulfilled before they will continue their bioenergy-related investments. In general, the answers showed a wide divergence of opinions among the respondents, which may reflect sawmills' differing emphases and expectations concerning bioenergy. In other words, bioenergy is still perceived as a rather novel and risky area of business at Finnish independent industrial sawmills. These results indicate that a massive expansion of bioenergy business at private sawmills in Finland is not a self-evident truth. The blocking barriers seem to be connected mainly to the demand for bioenergy and to money. The respondents' answers conveyed a growing dissatisfaction with the policies of the authorities, which do not treat sawmill-based bioenergy equally with other forms of bioenergy.
This proposition was summed up in a sawmill manager's comment: “There is a lot of bioenergy available, if they just want to make use of it.” It seems that the positive effects of the government's policies favouring renewables are not taking effect at private sawmills. However, as there nevertheless seems to be a lot of potential in the emerging bioenergy business at Finnish independent industrial sawmills, there is also a clear need for more profound future studies on this topic.
Abstract:
Electronic exchanges are double-sided marketplaces that allow multiple buyers to trade with multiple sellers, with aggregation of demand and supply across the bids to maximize the revenue in the market. In this paper, we propose a new design approach for a one-shot exchange that collects bids from buyers and sellers and clears the market at the end of the bidding period. The main principle of the approach is to decouple the allocation from the pricing. It is well known that it is impossible for an exchange with voluntary participation to be both efficient and budget-balanced. Budget balance is a mandatory requirement for an exchange to operate at a profit. Our approach is to allocate the trade so as to maximize the reported values of the agents. The pricing is posed as a payoff determination problem that distributes the total payoff fairly to all agents, with budget balance imposed as a constraint. We devise an arbitration scheme through an axiomatic approach to solve the payoff determination problem, using the added-value concept of game theory.
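The allocation rule described here (trade the quantities that maximize reported value, with pricing handled separately) can be illustrated with a single-unit double auction: sort buy bids descending and sell asks ascending and match while a bid still covers an ask. The sketch below is a generic illustration under that assumption; the paper's axiomatic, budget-balanced payoff-division scheme is not reproduced.

```python
def clear_exchange(buy_bids, sell_asks):
    """Surplus-maximizing allocation for a one-shot, single-unit exchange.

    buy_bids : reported values of buyers (one unit each)
    sell_asks: reported costs of sellers (one unit each)
    Returns matched (bid, ask) pairs and the total reported surplus.
    Pricing / payoff division is deliberately left out: the paper treats it
    as a separate budget-balanced arbitration problem.
    """
    bids = sorted(buy_bids, reverse=True)
    asks = sorted(sell_asks)
    trades, surplus = [], 0.0
    for bid, ask in zip(bids, asks):
        if bid < ask:          # further matches would reduce reported value
            break
        trades.append((bid, ask))
        surplus += bid - ask
    return trades, surplus

# Illustrative use
trades, surplus = clear_exchange([10, 8, 5, 3], [2, 4, 6, 9])
# trades = [(10, 2), (8, 4)], surplus = 12
```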
Abstract:
Fuel cell-based automobiles have gained attention in the last few years due to growing public concern about urban air pollution and the consequent environmental problems. From an analysis of the power and energy requirements of a modern car, it is estimated that a base sustainable power of ca. 50 kW, supplemented with short bursts of up to 80 kW, will suffice for most driving requirements. The energy demand depends greatly on driving characteristics but under normal usage is expected to be 200 Wh/km. The advantages and disadvantages of candidate fuel-cell systems and various fuels are considered, together with the issue of whether the fuel should be converted directly in the fuel cell or reformed to hydrogen onboard the vehicle. For fuel cell vehicles to compete successfully with conventional internal-combustion engine vehicles, it appears that direct-conversion fuel cells, using probably hydrogen but possibly methanol, are the only realistic contenders for road transportation applications. Among the available fuel cell technologies, polymer-electrolyte fuel cells directly fueled with hydrogen appear to be the best option for powering fuel cell vehicles, as there is every prospect that these will exceed the performance of internal-combustion engine vehicles in every respect but their first cost. A target cost of $50/kW would be mandatory to make polymer-electrolyte fuel cells competitive with internal combustion engines, and can only be achieved with design changes that would substantially reduce the quantity of materials used. At present, prominent car manufacturers are devoting major research and development efforts to developing fuel cell vehicles and are projecting to start production by 2005.
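A back-of-the-envelope check of the figures quoted above (50 kW base power, 80 kW peak, 200 Wh/km, $50/kW target) is sketched below; the 500 km trip length is purely an illustrative assumption, not a figure from the abstract.

```python
# Figures quoted in the abstract; the 500 km trip is an illustrative assumption.
base_power_kw   = 50      # sustained power requirement
peak_power_kw   = 80      # short-burst requirement
energy_per_km   = 0.200   # kWh/km under normal usage (200 Wh/km)
target_cost_usd = 50      # target cost per kW of stack power

trip_km = 500
trip_energy_kwh = trip_km * energy_per_km          # 100 kWh for 500 km
stack_cost_usd  = peak_power_kw * target_cost_usd  # $4,000 at the target cost

print(f"Energy for {trip_km} km trip: {trip_energy_kwh:.0f} kWh")
print(f"80 kW stack cost at target: ${stack_cost_usd:,.0f}")
```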
Abstract:
The wireless LAN (WLAN) market consists of IEEE 802.11 MAC standard conformant devices (e.g., access points (APs) and client adapters) from multiple vendors. Certain third-party certifications, such as those specified by the Wi-Fi Alliance, have been widely used by vendors to ensure basic conformance to the 802.11 standard, thus leading to the expectation that the available devices exhibit identical MAC-level behavior. In this paper, however, we present what we believe to be the first set of experimental results highlighting the fact that WLAN devices from different vendors in the market can have heterogeneous MAC-level behavior. Specifically, we demonstrate with examples and data that in certain cases devices may not be conformant with the 802.11 standard, while in other cases they may differ in significant details that are not part of the mandatory specifications of the standard. We argue that heterogeneous MAC implementations can adversely impact WLAN operations, leading to unfair bandwidth allocation, potential breakdown of related MAC functionality, and difficulties in provisioning the capacity of a WLAN. On the positive side, however, MAC-level heterogeneity can be useful in applications such as vendor/model-level device fingerprinting.
Abstract:
Precision, sophistication and economic factors mean that many areas of scientific research demand compute power of a very high magnitude. Advanced research in the area of high-performance computing is thus becoming inevitable. The basic principle of sharing and collaborative work by geographically separated computers is known by several names, such as metacomputing, scalable computing, cluster computing and internet computing, and this has today metamorphosed into a new term known as grid computing. This paper gives an overview of grid computing and compares various grid architectures. We show the role that patterns can play in architecting complex systems, and provide a very pragmatic reference to a set of well-engineered patterns that the practicing developer can apply to crafting his or her own specific applications. We are not aware of a pattern-oriented approach having been applied to develop and deploy a grid. There are many grid frameworks that have been built or are in the process of becoming functional. All these grids differ in some functionality or other, though the basic principle on which the grids are built is the same. Despite this, there are no standard requirements listed for building a grid. The grid being a very complex system, it is mandatory to have a standard Software Architecture Specification (SAS). We attempt to develop such a specification for use by any grid user or developer. Specifically, we analyze the grid using an object-oriented approach and present the architecture using UML. This paper proposes the usage of patterns at all levels (analysis, design and architectural) of grid development.
Abstract:
Chiral 2-pyridylsulfinamides were shown to be effective catalysts in the alkylation of aryl and alkyl aldehydes with diethylzinc, providing the corresponding alcohols in excellent enantioselectivity. Sulfinamide catalysts possessing chirality solely at the sulfur center produced the product phenethyl alcohol in good enantioselectivity. Diastereomeric sulfinamides possessing chirality both at the carbon bearing the nitrogen and at the sulfur of the sulfinamide increased the enantioselectivity of the product alcohols up to >99%. However, the matched/mismatched pairing of the sulfinamide diastereomers had no effect on the outcome of the chiral induction of the product phenethyl alcohols. It was conclusively shown that chirality at the sulfur center is mandatory for obtaining good enantioselectivity in the reaction.
Abstract:
The Himalayan region is one of the most active seismic regions in the world, and many researchers have highlighted the possibility of a great seismic event in the near future due to the seismic gap. Seismic hazard analysis and microzonation of highly populated places in the region are mandatory at a regional scale. A region-specific ground motion predictive equation (GMPE) is an important input for seismic hazard analysis in macro- and micro-zonation studies. The few GMPEs developed in India are based on recorded data and are applicable only for a particular range of magnitudes and distances. This paper focuses on the development of a new GMPE for the Himalayan region considering both recorded and simulated earthquakes of moment magnitude 5.3-8.7. A finite-fault simulation model has been used for the ground motion simulation, considering region-specific seismotectonic parameters from past earthquakes and source models. Simulated acceleration time histories and response spectra are compared with available records. In the absence of a large number of recorded data, simulations have been performed at unavailable locations by adopting the apparent stations concept. Earthquakes recorded up to 2007 have been used for the development of the new GMPE, and earthquake records after 2007 are used to validate it. The proposed GMPE matches very well with recorded data and also with other highly ranked GMPEs developed elsewhere and applicable to the region. Comparison of response spectra has also shown good agreement with recorded earthquake data. A quantitative analysis of residuals for the proposed GMPE and region-specific GMPEs in predicting the records of the 2011 Nepal-India earthquake (Mw 5.7) shows that the proposed GMPE predicts peak ground acceleration and spectral acceleration over the entire distance and period range with lower percentage residuals than the existing region-specific GMPEs. Crown Copyright (C) 2013 Published by Elsevier Ltd. All rights reserved.
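A GMPE of the kind described is essentially a regression of log ground motion on magnitude and distance. The sketch below uses a generic functional form fit by ordinary least squares; the form, the pseudo-depth term and the coefficients are illustrative assumptions, since the paper's actual equation is not given in the abstract.

```python
import numpy as np

def fit_simple_gmpe(magnitudes, distances_km, pga_g):
    """Least-squares fit of a generic GMPE:
        ln(PGA) = c1 + c2*M + c3*ln(R + 10)
    The functional form and the +10 km pseudo-depth are illustrative
    assumptions, not the form used in the paper.
    """
    M = np.asarray(magnitudes, dtype=float)
    R = np.asarray(distances_km, dtype=float)
    y = np.log(np.asarray(pga_g, dtype=float))
    X = np.column_stack([np.ones_like(M), M, np.log(R + 10.0)])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs  # c1, c2, c3

def predict_pga(coeffs, magnitude, distance_km):
    """Predict PGA (g) from a fitted coefficient vector."""
    c1, c2, c3 = coeffs
    return np.exp(c1 + c2 * magnitude + c3 * np.log(distance_km + 10.0))
```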
Abstract:
We consider the problem of Probably Approximately Correct (PAC) learning of a binary classifier from noisy labeled examples acquired from multiple annotators (each characterized by a respective classification noise rate). First, we consider the complete information scenario, where the learner knows the noise rates of all the annotators. For this scenario, we derive a sample complexity bound for the Minimum Disagreement Algorithm (MDA) on the number of labeled examples to be obtained from each annotator. Next, we consider the incomplete information scenario, where each annotator is strategic and holds the respective noise rate as private information. For this scenario, we design a cost-optimal procurement auction mechanism along the lines of Myerson's optimal auction design framework in a non-trivial manner. This mechanism satisfies the incentive compatibility property, thereby facilitating the learner to elicit the true noise rates of all the annotators.
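The complete-information step relies on the Minimum Disagreement Algorithm: pick the hypothesis that disagrees with the fewest noisy labels pooled across annotators. A minimal sketch over a finite hypothesis class is given below; the sample-complexity bound and the procurement auction mechanism are not reproduced here.

```python
def minimum_disagreement(hypotheses, labeled_examples):
    """Return the hypothesis with the fewest disagreements on noisy labels.

    hypotheses       : iterable of callables h(x) -> 0/1 (finite class)
    labeled_examples : list of (x, y) pairs pooled from all annotators,
                       where y is the (possibly noisy) 0/1 label
    """
    best_h, best_err = None, float("inf")
    for h in hypotheses:
        disagreements = sum(1 for x, y in labeled_examples if h(x) != y)
        if disagreements < best_err:
            best_h, best_err = h, disagreements
    return best_h, best_err

# Illustrative use with threshold classifiers on the real line
hyps = [lambda x, t=t: int(x >= t) for t in (0.0, 0.5, 1.0)]
data = [(0.2, 0), (0.6, 1), (0.9, 1), (0.4, 0), (0.7, 1)]
h_star, err = minimum_disagreement(hyps, data)  # threshold 0.5, 0 disagreements
```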
Abstract:
Awareness of the need for sustainable and eco-friendly mobility has been increasing, and various innovations are taking place in this regard. A study was carried out to assess the feasibility of installing solar photovoltaic (PV) modules atop train coaches. Most long-distance trains with LHB coaches do not have self-generating systems, making power cars mandatory to supply the required power for lighting loads. The feasibility of supplementing diesel generator sets with power from solar PV modules installed on coach rooftops is reported in this communication. Not only does this conserve fuel, it also significantly reduces CO2 emissions. This work has shown that the area available on coach rooftops is more than sufficient to generate, during sunlight hours, the power required for the electrical loads of a non-A/C coach even during winter. All calculations were done with a standard route as the reference. Taking the cost of diesel to be Rs 66/litre, it was estimated that this scheme would yield annual savings of Rs 5,900,000, corresponding to 90,800 litres of diesel per rake per year. The installation cost of the solar modules would be recovered within 2-3 years. Implementation of this scheme would also amount to an annual reduction of 239 tonnes of CO2 emissions.
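The savings figure quoted above follows from the other numbers in the abstract; the sketch below reproduces it, and the CO2 emission factor shown is simply the one implied by the reported 239 t for 90,800 litres, not an independently sourced value.

```python
# Figures taken from the abstract
diesel_price_rs  = 66        # Rs per litre
diesel_saved_l   = 90_800    # litres saved per rake per year
co2_total_tonnes = 239       # reported annual CO2 reduction

annual_savings_rs = diesel_saved_l * diesel_price_rs               # ~Rs 5.99 million
implied_co2_kg_per_litre = co2_total_tonnes * 1000 / diesel_saved_l  # ~2.63 kg per litre

print(f"Annual savings: Rs {annual_savings_rs:,.0f} (quoted as Rs 5,900,000)")
print(f"Implied CO2 factor: {implied_co2_kg_per_litre:.2f} kg per litre of diesel")
```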
Electrical and optical spectroscopy for quantitative screening of hepatic steatosis in donor livers.
Abstract:
Macro-steatosis in deceased donor livers is increasingly prevalent and is associated with poor function or non-function of the liver upon reperfusion. Current assessment of the extent of steatosis depends upon the macroscopic assessment of the liver by the surgeon and histological examination, if available. In this paper we demonstrate electrical and optical spectroscopy techniques which quantitatively characterize fatty infiltration in liver tissue. Optical spectroscopy showed a correlation coefficient of 0.85 when referenced to clinical hematoxylin and eosin (H&E) sections in 20 human samples. With further development, an optical probe may provide a comprehensive measure of steatosis across the liver at the time of procurement.
Abstract:
This article examines theories about the origin, contagion and control of cholera in the nineteenth century, the attempts of the Argentine authorities to counter these epidemics and, finally, the anti-cholera campaign of 1910. Up to that point, preventive measures had prioritized the surveillance, disinfection and isolation of infected dwellings, objects and persons. But the recent discovery that cholera could be transmitted by asymptomatic individuals led the Departamento Nacional de Higiene (DNH) to impose a system of mandatory bacteriological testing in 1910. In particular, the article examines the ideas and activities of José Penna, who in 1910 served as director of the DNH, and of Salvador Mazza. A recently graduated physician, the latter was in charge of the bacteriological laboratory of the Martín García lazaretto, where all third-class passengers arriving from cholera-infected areas were subjected to testing. The DNH presented the 1910 anti-cholera campaign as the result of the experience accumulated during the nineteenth century, of Argentina's scientific and administrative progress, and of the authorities' efforts to protect the nation. At a time when the Argentine elite was struggling to maintain its dominance, both by repressing the opposition and by seeking to co-opt it, public health questions constituted an important element of political rhetoric.
Abstract:
The concept of intervened death encompasses all those clinical situations in which the withholding or withdrawal of some method of life support in the course of medical care is proposed as a limit on treatment, that limit being linked to the arrival of traditional cardiorespiratory death. The presence of a medical technology (life support) common to all of these cases allows them to be considered comprehensively as a whole. Ultimately, in all of them death is linked to an action or omission involving medical technology. A common evaluation of these cases not only follows logically from the procedures that made the existence of these patients possible, but also promotes a true and correct interpretation of the facts on the part of society. Reflection on intervened death as an emergent phenomenon of our culture is essential if society is to become involved and participate in an issue that concerns it absolutely and exclusively.
Abstract:
Scientific and technological advances in neonatology over the last 40 years have brought about a substantial improvement in the survival of extremely low birth weight newborns; nevertheless, neonatal mortality still represents a very large share of infant mortality. This is mainly related to deaths from prematurity and its complications, congenital anomalies and perinatal asphyxia. Most newborns are treated favourably in the delivery room and are admitted to the neonatal intensive care unit (NICU). The uncertain prognosis of extremely preterm infants at the limit of viability, who are at high risk of dying in the NICU or of suffering some disability, presents a difficult ethical dilemma. Each case must be considered individually, weighing the risks and benefits of the possible courses of action against the “best interests of the child” and the wishes of the parents, which will guide ethical decisions. Different care guidelines and variations in medical practice at the limits of fetal viability have been described within and between countries. The objective is to provide parents with open, direct and transparent communication, with sufficient understanding of the most relevant factors regarding the clinical situation, the prognosis and the treatment options, so that they can participate meaningfully in decision-making. It must be accepted that, in neonatology, doing everything one can do may be harmful rather than useful or beneficial. Not everything that is technically possible is ethically correct. The dilemma affects both the origin of life and the end of life.