731 results for Project Quality, Continuous Improvement, Stakeholder Management, Construction Project Delivery


Relevance:

100.00%

Abstract:

Crop residues returned to the soil are important to preserve fertility and sustainability. This research addressed the long-term decomposition of sugarcane post-harvest residues (trash) under reduced tillage: field renewal was performed with herbicide followed by subsoiling, and ratoons were not subjected to interrow scarification. The trial was conducted in northern Sao Paulo State, Brazil, during four consecutive crops (2005-2008), where litter bags containing N-15-labeled trash were placed in the field to simulate two distinct situations: previous crop trash (PCT), i.e. residues incorporated in the field after tillage, and post-harvest trash (PHT), i.e. the remains of the plant-cane harvest. Decomposition rates regarding dry matter (DM), carbon (C), root growth, plant nutrients (N, P, K, Ca, Mg and S), and lignin (LIG), cellulose (CEL) and hemicellulose (HCEL) contents were assessed for PCT (2005-2008) and for PHT (2006-2008). There were significant reductions in DM and in the C:N ratio due to C losses and root growth within the litter bags over time. The DM from PCT and PHT decreased 96% and 73% after four and three crops, respectively, and the highest nutrient release was found for K, Ca and N. The LIG, CEL and HCEL concentrations decreased 60%, 29% and 70% in PCT after four crops, and 47%, 35% and 70% in PHT after three crops, respectively. Trash decomposition was driven mainly by the biochemical composition of the residues, root growth within the trash blanket, and the climatic conditions during the crop cycles. (C) 2012 Elsevier Ltd. All rights reserved.
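Litter-bag mass losses like those reported above are commonly summarized with a single-pool exponential decay model, X(t) = X0·exp(−kt). The sketch below is only an illustration of that standard model, not the authors' analysis: it back-calculates annual decay constants from the reported DM losses (96% over four crops for PCT, 73% over three crops for PHT).

```python
import math

def decay_constant(fraction_remaining, years):
    """Annual decay constant k for a single-pool model X(t) = X0 * exp(-k * t)."""
    return -math.log(fraction_remaining) / years

def half_life(k):
    """Time for half of the residue dry matter to decompose."""
    return math.log(2) / k

# DM losses reported above: PCT lost 96% over 4 crops, PHT lost 73% over 3 crops
k_pct = decay_constant(1 - 0.96, 4)  # ~0.80 per year
k_pht = decay_constant(1 - 0.73, 3)  # ~0.44 per year
print(k_pct, half_life(k_pct))
print(k_pht, half_life(k_pht))
```

The faster decay of PCT is consistent with its incorporation into the soil after tillage, whereas PHT remains on the surface as a trash blanket.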

Relevance:

100.00%

Abstract:

This paper presents the data quality analysis used in the construction of an ad hoc observation instrument. It is a mixed system of field formats and exhaustive, mutually exclusive (E/ME) category systems, whose purpose is to code the attack phase in beach handball. The criteria used are: minute, score, finishing zone, and finishing player. Twelve observations of senior men's national teams were coded. The analysis was carried out using consensual agreement (a qualitative approach to data quality), building an error-detection file, computing Cohen's kappa index and the Kendall Tau-B, Pearson and Spearman correlation indices, and performing a generalizability analysis. The correlation coefficients show a minimum index of .993, Cohen's kappa values stand at .917, and the generalizability indices are optimal. These results ensure that the observation instrument, besides fitting well, allows reliable and precise recording.
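As a quick illustration of the inter-observer agreement statistic reported here, Cohen's kappa can be computed from an agreement (confusion) table between two observers. This is a generic sketch with made-up counts, not the study's data.

```python
def cohen_kappa(table):
    """Cohen's kappa for a square inter-observer agreement table.

    table[i][j] = number of events coded as category i by observer 1
    and category j by observer 2.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    p_observed = sum(table[i][i] for i in range(k)) / n
    row_totals = [sum(row) for row in table]
    col_totals = [sum(row[j] for row in table) for j in range(k)]
    # Chance agreement from the marginal category frequencies
    p_expected = sum(row_totals[i] * col_totals[i] for i in range(k)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical agreement table for two observers and two categories
print(cohen_kappa([[45, 5], [5, 45]]))  # ~0.8
```

Kappa corrects raw agreement for the agreement expected by chance, which is why it is preferred over simple percent agreement for categorical coding schemes.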

Relevance:

100.00%

Abstract:

The study of protein expression profiles for biomarker discovery in serum and in mammalian cell populations requires the continuous improvement and combination of protein/peptide separation techniques, mass spectrometry, and statistical and bioinformatic approaches. In this thesis work, two different mass spectrometry-based protein profiling strategies were developed and applied to liver diseases and inflammatory bowel diseases (IBDs) for the discovery of new biomarkers. The first, based on bulk solid-phase extraction combined with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) and chemometric analysis of serum samples, was applied to the study of serum protein expression profiles both in IBDs (Crohn's disease and ulcerative colitis) and in liver diseases (cirrhosis, hepatocellular carcinoma, viral hepatitis). The approach allowed the enrichment of serum proteins/peptides thanks to the large interaction surface between analytes and solid phase, and a high recovery thanks to an elution step performed directly on the MALDI target plate. Furthermore, a chemometric algorithm for selecting the variables with the highest discriminant power made it possible to identify patterns of 20-30 proteins involved in the differentiation and classification of serum samples from healthy donors and diseased patients. These protein profiles discriminate among the pathologies with good classification and prediction abilities. In particular, in the study of inflammatory bowel diseases, after C18 analysis of 129 serum samples from healthy donors, Crohn's disease and ulcerative colitis patients, and inflammatory controls, a classification ability of 90.7% and a prediction ability of 72.9% were obtained. In the study of liver diseases (hepatocellular carcinoma, viral hepatitis and cirrhosis), a prediction ability of 80.6% was achieved using IDA-Cu(II) as the extraction procedure.
The identification of the selected proteins by MALDI-TOF/TOF MS analysis, or by their selective enrichment followed by enzymatic digestion and MS/MS analysis, may give useful information for identifying new biomarkers involved in the diseases. The second mass spectrometry-based protein profiling strategy was based on a label-free liquid chromatography electrospray ionization quadrupole time-of-flight differential analysis approach (LC ESI-QTOF MS), combined with targeted MS/MS analysis of only the identified differences. The strategy was used for biomarker discovery in IBDs, and in particular in Crohn's disease. The enriched serum peptidome and the subcellular fractions of intestinal epithelial cells (IECs) from healthy donors and Crohn's disease patients were analysed. Combining the enrichment step for low-molecular-weight serum proteins with the LC-MS approach made it possible to identify a pattern of peptides derived from specific exoprotease activity in the coagulation and complement activation pathways. Among these peptides, particularly interesting was the discovery of clusters of peptides from fibrinopeptide A, apolipoproteins E and A4, and complement C3 and C4. Further studies need to be performed to evaluate the specificity of these clusters and validate the results, in order to develop a rapid serum diagnostic test. Label-free LC ESI-QTOF MS differential analysis of the subcellular fractions of IECs from Crohn's disease patients and healthy donors revealed many proteins that could be involved in the inflammation process. Among them, heat shock protein 70, tryptase alpha-1 precursor, and proteins whose upregulation can be explained by the increased activity of IECs in Crohn's disease were identified. Follow-up studies will be performed to validate the results and to investigate in depth the inflammation pathways involved in the disease.
Both mass spectrometry-based protein profiling strategies proved to be useful tools for the discovery of disease biomarkers, which need to be validated in further studies.
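The distinction drawn above between classification ability (the model fitted and scored on the same samples) and prediction ability (scored on held-out samples) can be illustrated with any classifier. The minimal sketch below uses a nearest-centroid rule on toy data with leave-one-out cross-validation; it is a generic illustration, not the chemometric algorithm used in the thesis.

```python
import numpy as np

def nearest_centroid(X_train, y_train, X_new):
    """Assign each new sample to the class with the closest mean profile."""
    classes = np.unique(y_train)
    cents = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X_new[:, None, :] - cents[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

def classification_ability(X, y):
    """Resubstitution accuracy: fitted and evaluated on all samples."""
    return float((nearest_centroid(X, y, X) == y).mean())

def prediction_ability(X, y):
    """Leave-one-out cross-validated accuracy."""
    hits = [
        nearest_centroid(np.delete(X, i, axis=0), np.delete(y, i), X[i:i + 1])[0] == y[i]
        for i in range(len(y))
    ]
    return float(np.mean(hits))

# Toy intensity profiles: two well-separated groups
X = np.array([[0.0], [0.1], [1.0], [1.1]])
y = np.array([0, 0, 1, 1])
print(classification_ability(X, y), prediction_ability(X, y))
```

On real, noisy spectra the cross-validated prediction ability is typically lower than the resubstitution figure, which is exactly the gap seen in the reported 90.7% vs. 72.9% values.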

Relevance:

100.00%

Abstract:

BACKGROUND: The question of whether patients suffering from end-stage emphysema who are candidates for lung transplantation should be treated with a single-lung or a double-lung transplantation is still unanswered. METHODS: We reviewed 24 consecutive lung transplant procedures, comparing the results of 6 patients with a unilateral and 17 with a bilateral transplantation. PATIENTS AND RESULTS: After bilateral transplantation, patients showed a trend towards better blood gas exchange, with shorter times on the ventilator and in intensive care, compared with patients after the unilateral procedure. Three-year actuarial survival was higher in the bilateral transplantation group (83% versus 67%). There was continuous improvement in pulmonary function in both groups during the first months after transplantation. Vital capacity and forced expiratory volume in the first second were significantly higher in the bilateral transplant group. CONCLUSION: Both unilateral and bilateral transplantation are feasible for patients with end-stage emphysema. Bilateral transplantation results in better pulmonary reserve capacity and faster rehabilitation.

Relevance:

100.00%

Abstract:

What does it mean for curriculum to be interactive? It encourages student engagement and active participation in both individual and group work. It offers teachers a coherent set of materials to choose from that can enhance their classes. It is the product of on-going development and continuous improvement based on research and feedback from the field. This paper will introduce work in progress from the Center for Excellence in Education, Science, and Technology (CELEST), an NSF Science of Learning Center. Among its many goals, CELEST is developing a unique educational curriculum, an interactive curriculum based upon models of mind and brain. Teachers, administrators, and governments are naturally concerned with how students learn. Students are greatly concerned about how minds work, including how to learn. CELEST aims to introduce curricula that not only meet current U.S. standards in mathematics, science, and psychology but also influence plans to improve those standards. Software and support materials are in development and available at http://cns.bu.edu/celest/private/. Interested parties are invited to contact the author for access.

Relevance:

100.00%

Abstract:

In Part 1 of this article we discussed the need for information quality and the systematic management of learning materials and learning arrangements. Digital repositories, often called Learning Object Repositories (LOR), were introduced as a promising answer to this challenge. We also derived technological and pedagogical requirements for LORs from a concretization of information quality criteria for e-learning technology. This second part presents technical solutions that particularly address the demands of open education movements, which aspire to a global reuse and sharing culture. From this viewpoint, we develop core requirements for scalable network architectures for educational content management. We then present edu-sharing, an advanced example of a network of homogeneous repositories for learning resources, and discuss related technology. We conclude with an outlook in terms of emerging developments towards open and networked system architectures in e-learning.

Relevance:

100.00%

Abstract:

This chapter presents fuzzy cognitive maps (FCMs) as a vehicle for Web knowledge aggregation, representation, and reasoning. The corresponding Web KnowARR framework incorporates findings from fuzzy logic. The first emphasis is on the Web KnowARR framework itself; the second is a stakeholder management use case that illustrates the framework's usefulness. This form of management helps projects gain acceptance and assertiveness by actively involving, in the management process, those whose claims bear on company decisions. Stakeholder maps visually (re-)present these claims. They draw on non-public content on the one hand, and on content that is available to the public (mostly on the Web) on the other. The Semantic Web offers opportunities not only to present public content descriptively but also to show relationships. The proposed framework can serve as the basis for the public content of stakeholder maps.
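A fuzzy cognitive map can be pictured as a weighted digraph over concepts whose activations are iterated through a squashing function. The toy map below (hypothetical concepts and weights, not taken from the chapter) shows the standard FCM update rule A(t+1) = f(Wᵀ·A(t)) with a sigmoid f.

```python
import numpy as np

def fcm_run(W, state, lam=1.0, steps=50):
    """Iterate an FCM: state_i(t+1) = sigmoid(sum_j W[j, i] * state_j(t))."""
    for _ in range(steps):
        state = 1.0 / (1.0 + np.exp(-lam * (W.T @ state)))
    return state

# Hypothetical 3-concept map, e.g. stakeholder pressure, project acceptance,
# decision risk; W[j, i] is the causal weight of concept j on concept i.
W = np.array([
    [0.0, 0.7, -0.4],   # concept 0 promotes 1, inhibits 2
    [0.0, 0.0, 0.6],    # concept 1 promotes 2
    [0.3, 0.0, 0.0],    # concept 2 feeds back on 0
])
final = fcm_run(W, np.array([1.0, 0.0, 0.0]))
print(final)  # activations settle inside (0, 1)
```

With moderate weights the iteration contracts to a fixed point, which is the "inferred" state of the map used for reasoning over the aggregated knowledge.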

Relevance:

100.00%

Abstract:

Introduction: The implementation of complementary and alternative therapies into conventional treatment schemes is gaining popularity. However, their use depends largely on the patient's own initiative. This case report focuses on a patient's experience of integrating WATSU (WaterShiatsu) into rehabilitative care. Methods: Patient: A 52-year-old woman survived a severe motorcycle accident in which she sustained several fractures on the right side of her body, including ribs, pelvis, and femur. After discharge from inpatient care, she independently added WATSU to her rehabilitative regimen. Treatment approach: WATSU is a passive form of hydrotherapy in warm water that aims at relaxation, pain relief, and a sense of security. In the reported case, an experienced WATSU therapist who is also trained in physiotherapy and psychosomatics delivered weekly sessions of one hour's duration. Measures used: Qualitative data were collected from the patient's diary. The therapist's notes, including the Patient-Specific Functional Scale (PSFS), were also considered. Results: The patient associated WATSU with trunk mobilization (followed by improved breathing), reconciliation with her body, and emotional discharge. She ascribed lasting effects on her body image to WATSU. The therapist employed WATSU for careful mobilization and to equalize awareness throughout the body. The PSFS displayed continuous improvement in all categories except the use of public transportation. Due to complications (elevated inflammation markers), only 6 of the 8 scheduled sessions were administered. Conclusions: WATSU was experienced as helpful in approaching conditions that are difficult to address with conventional physiotherapy. In early rehabilitation, additional medical/physiotherapeutic skills of contributing complementary therapists are advocated.

Relevance:

100.00%

Abstract:

In the current climate of escalating health care costs, defining value and accurately measuring it are two critical issues affecting not only the future of cancer care in particular but also the future of health care in general. Specifically, measuring and improving value in cancer-related health care are critical for continued advancements in research, management, and overall delivery of care. However, in oncology, most of this research has focused on value as it relates to insurance industry and payment reform, with little attention paid to value as the output of clinical interventions that encompass integrated clinical teams focusing on the entire cycle of care and measuring objective outcomes that are most relevant to patients. In this study, patient-centered value was defined as health outcomes achieved per dollar spent, and calculated using objective functional outcomes and total care costs. The analytic sample comprised patients diagnosed with three common head and neck cancers (cancer of the larynx, oral cavity, and oropharynx) who were treated in an integrated tertiary care center over an approximately 10-year period. The results of this study provide initial empirical data that can be used to assess and ultimately to help improve the quality and value of head and neck cancer care; more importantly, they can be used by patients and clinicians to make better-informed decisions about care, particularly about which therapeutic services and outcomes matter most to patients.

Relevance:

100.00%

Abstract:

During China's transition from a planned economy to a market economy in the 1990s, there was a considerable accumulation of deferred payments and defaults due to weak enforcement institutions, a phenomenon common in the transition economies of that time. Interviews with home electronics appliance firms revealed that firms coped with this problem by adjusting their sales mechanisms (four types were found), and that the benefit of institutions was limited. A theoretical analysis claims that spot transactions and integration are inferior to contracts, and that a contract combining a volume rebate and prepayment with an exclusive agent can realize the lowest cost and price. The empirical part shows that mechanisms converged on the volume-rebate, exclusive-agent mechanism, whose price level is the lowest. Competition is the driving force behind the convergence of mechanisms and the improvement of risk management capacity.

Relevance:

100.00%

Abstract:

This thesis presents a new approach for mapping air quality, so that this variable of the physical environment can be taken into account in physical or territorial planning. Ambient air quality is not normally considered in territorial planning, mainly due to the complexity of its composition and behavior and the difficulty of obtaining reliable and contrasted information. In addition, the wide spatial and temporal variability of air quality measurements makes their territorial consideration difficult and requires georeferenced information. This involves predicting measurements at places in the territory where there are no data. This thesis develops a geostatistical model for predicting air quality values in a territory. The proposed model is based on the interpolation of pollutant concentration measurements from the monitoring stations, using ordinary kriging, after a detrending process that removes the local character of the sampling values. With the detrending process, the local trends in the time series of sampling data, due to temporal and spatial variations of air quality, are removed. The transformation of the air quality values into site-independent quantities is performed using land use parameters and other characteristic variables of the local scale. This detrending of the monitoring data results in a spatially homogeneous input set, which is a prerequisite for the correct use of any interpolation algorithm, in particular ordinary kriging. After the interpolation step, a retrending, or retransformation, is applied in order to restore the local character to the final map at places where no monitoring data are available. For the development of this model, the Community of Madrid was chosen as the study area because of the availability of actual data.
These data, air quality values and territorial variables, are used at two stages: a starting point, in which the selection of the most suitable parameters for the detrending process is optimized and each of the model stages is developed; and a second stage, in which the developed model is fully implemented and its predictive power evaluated. The model is applied to estimate the average and maximum NO2 values in the study territory. With the implementation of the proposed model, the territorialization of air quality data is undertaken with a reduction in three key factors for the effective integration of this parameter in territorial planning or the associated decision-making process: uncertainty, the time taken to generate the prediction, and the associated resources (data and costs). The model allows the prediction of pollutant values within hours, compared to the modeling or analysis periods required by other methodologies. The required resources are also minimal: only data from the monitoring stations in the territory, which are normally available on the institutional websites of the bodies that manage the air quality monitoring networks. With regard to the prediction uncertainties, the results of the proposed model are statistically very accurate, and the mean errors are generally similar to or lower than those found with existing methodologies.
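The interpolation step of the detrend-krige-retrend workflow described above is ordinary kriging. The self-contained sketch below builds and solves the ordinary kriging system for a single target location, using an exponential semivariogram with made-up parameters and invented station values (in the thesis, the variogram is fitted to the homogenized monitoring data).

```python
import numpy as np

def exp_semivariogram(h, nugget=0.0, sill=1.0, rng=1.0):
    """Exponential semivariogram model gamma(h)."""
    return nugget + sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(coords, values, target, vgram=exp_semivariogram):
    """Ordinary kriging prediction at one target point.

    Solves the kriging system with a Lagrange multiplier so that the
    weights sum to 1 (unbiasedness constraint).
    """
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = vgram(d)
    K[n, n] = 0.0
    rhs = np.ones(n + 1)
    rhs[:n] = vgram(np.linalg.norm(coords - target, axis=1))
    sol = np.linalg.solve(K, rhs)
    weights = sol[:n]            # sol[n] is the Lagrange multiplier
    return float(weights @ values)

# Three homogenized station values (hypothetical) and a prediction point
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
values = np.array([10.0, 12.0, 11.0])
print(ordinary_kriging(coords, values, np.array([0.5, 0.5])))
# Exact interpolation: predicting at a station reproduces its value
print(ordinary_kriging(coords, values, coords[0]))  # ~10.0
```

Because the input values have been detrended to site-independent quantities, the kriged surface is then retransformed with the local-scale parameters to recover the final map.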

Relevance:

100.00%

Abstract:

Two complementary benchmarks have been proposed so far for the evaluation and continuous improvement of RDF stream processors: SRBench and LSBench. They focus on different features of the evaluated systems, including coverage of the streaming extensions of SPARQL supported by each processor, query processing throughput, and an early analysis of query evaluation correctness based on comparing the results obtained by different processors for a set of queries. However, neither of them has analysed the operational semantics of these processors in order to assess the correctness of query evaluation results. In this paper, we propose a characterization of the operational semantics of RDF stream processors, adapting well-known models from the stream processing engine community: CQL and SECRET. Through this formalization, we address correctness in RDF stream processor benchmarks, making it possible to determine the multiple answers that systems may correctly provide. Finally, we present CSRBench, an extension of SRBench that addresses query result correctness verification using an automatic method.
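The point about multiple correct answers can be made concrete with CQL-style time-based windows: under a SECRET-style analysis, processors that report at different moments produce different, yet individually correct, result streams over the same input. The sketch below is a generic illustration with invented timestamps, not part of CSRBench itself.

```python
def time_window(stream, t, width):
    """CQL-style time-based sliding window: items with timestamp in (t - width, t]."""
    return [v for ts, v in stream if t - width < ts <= t]

# A toy timestamped stream of elements
stream = [(1, 'a'), (2, 'b'), (4, 'c'), (5, 'd')]
WIDTH = 3

# Report policy 1: evaluate on every arriving element (content-change reporting)
per_element = [len(time_window(stream, ts, WIDTH)) for ts, _ in stream]

# Report policy 2: evaluate periodically, every WIDTH time units
periodic = [len(time_window(stream, t, WIDTH)) for t in (3, 6)]

print(per_element)  # [1, 2, 2, 2]
print(periodic)     # [2, 2]
```

Both output sequences are valid answers to "count the elements in the last 3 time units"; they differ only in the report dimension, which is exactly the kind of behavior the proposed formalization captures when judging benchmark results.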

Relevance:

100.00%

Abstract:

The outsourcing market has been growing in recent years and is expected to keep doing so, but this growth has been limited by the failure of many projects which, in some cases, have led organizations to take those services back in house (insourcing). These failures have been due largely to problems with providers: lack of experience, insufficient capacity to take on the projects, and difficult communication. There are good-practice frameworks for managing outsourcing projects on the client side, but not for providers, who base the provision of services on their past experience and technical capabilities. The aim of this paper is to demonstrate the need for a methodology that guides providers throughout the whole life cycle of an outsourcing project and facilitates the provision of well-managed, quality services.

Relevance:

100.00%

Abstract:

The outsourcing market has been growing in recent years and is expected to keep doing so, but this growth has been limited by the failure of many projects. These failures have been due largely to problems with providers: lack of experience and capacity to take on the projects, and difficult communication. Good-practice frameworks have been proposed for managing outsourcing projects from the client's point of view, but not for providers, who base the provision of services on their past experience and technical capabilities. The aim of this paper is to establish the need for a methodology that guides providers throughout the whole outsourcing life cycle and facilitates the provision of well-managed, quality services.

Relevance:

100.00%

Abstract:

The outsourcing market has been growing in recent years and is expected to keep doing so in the coming years, but this growth has been limited by the failure of many projects which, in some cases, have led organizations to take those services back in house (insourcing). These failures have been due largely to problems with providers: lack of experience, insufficient capacity to take on the projects, and difficult communication. Unlike in other disciplines, there is no methodology that helps both clients and providers of IT outsourcing services to govern and manage their projects and achieve the intended results. In recent years, alongside the expansion of outsourcing, some models and good-practice frameworks for managing outsourcing projects have appeared, but they generally cover only some aspects of management. They cannot be considered methodologies, because they define no roles, responsibilities, or deliverables; they are usually the result of experience in managing other kinds of projects. It should also be noted that, except for eSCM-SP, which is a good-practice model for improving capability in service provision, all of them are client-oriented. The aim of this thesis is, on the one hand, to demonstrate the need for a methodology that guides providers throughout the whole life cycle of an outsourcing project and, on the other, to propose a methodology that covers everything from the initial stage of searching for business opportunities, the evaluation of RFP proposals, the decision whether or not to bid for the service provision, participation in the due diligence, the signing of the contract, and the transition and delivery of services, through to the termination of the contract.
The methodology is organized around a five-stage outsourcing life cycle, defining for each stage the participating roles and the responsibilities they must assume, the activities to be carried out, and the deliverables to be produced, which serve as control elements both for managing the project and for providing the service. The methodology has been validated by applying it to IT service provision projects of a medium-sized Spanish company and comparing the results obtained with those achieved in previous projects.