Wednesday, November 28, 2018


What is the Definition of Fair Trade?


Fair trade is one of the most pressing issues slowing economic integration around the world. Achieving fair trade requires that nations agree on and enforce rules that are taking too long to incorporate into the current WTO framework.

· Regional trade blocs like MERCOSUR, ASEAN, OATUU and others need to embrace international standards. UNCTAD is the international agency leading the way in collecting trade data on tariff measures; its datasets cover most countries and can be freely disseminated.

· A burning issue in the markets today is the lack of transparency on trade regulations by country. “Drawer” regulations made on the fly at border crossings impose a hidden cost on trade, especially in underdeveloped countries.

· In Africa, many exporters sometimes lose half of their potential export earnings because European Union regulations differ from the international standards set by the International Organization for Standardization. Adopting international standards based on global best practices in trade should promote sustainable development while decreasing the negative impact on the environment.

· Adopting standard rules avoids the burden of red tape imposed by each country’s regulatory regime. The rules and guidelines on the issue are already available, embedded in WTO and OECD rules, but the overall application of these principles is in many cases missing.

· Procedural requirements at border crossings need to be backed by technical assistance and training of enforcement officials, so that countries can jointly accept common rules of trade, streamline each country’s regulatory regime and thus reduce procedural obstacles.

Trust issues

Concerns over public health and environmental protection have fueled the backlash against globalization, alongside the growing influence of elites protecting their own turf to the detriment of everyone else. Moreover, simply reducing barriers or restrictions to trade does not translate linearly into gains, especially when the mix includes influential politicians or well-connected elites. At the end of the day, countries must evaluate whether non-tariff measures such as subsidies are legitimate or should be used as trade-offs to bring about trade fairness and efficiency. Only by adopting these complementary policies can individuals and markets gain credibility and fairness and become vibrant drivers of jobs and incomes.

“The World Economic Forum’s E15 Initiative has emphasized the importance of efficient global trade in fostering economic growth. The scale and complexity of the modern, globalized, system is made clear by visualizations such as these, of global shipping.”

TTIP is a trade agreement currently being negotiated by the US and the EU that would reduce tariffs and regulatory barriers to transatlantic trade and investment to minimum levels. The goal is for each side of the Atlantic to give the other’s companies access to its markets under standardized regulations and procedures.

It has been reported that the US and EU countries together represent $1 trillion in trade every year. The agreement would cover 45% of global GDP, making TTIP the world’s largest trade agreement, covering pharmaceuticals, automotive, energy, finance, chemicals, clothing, and food and drink, among other sectors.

The Internet is taking trade into a new dimension of push-button commerce in a fast-paced environment, one that requires factories close to their markets and new digital distribution platforms. The trend is reshaping the market, and corporations are adapting by using multi-layered global platforms and supply chains. The age of digital transactions is here to stay in a fast-moving economy that in the past tended to centralize for better management and quality control; cloud computing is changing all that, generating a new level of cooperation between producers, the supply chain and the ultimate consumer.

Big data management is surging as a new field not only in science but in commerce as well, as giants like Alibaba, eBay and Amazon apply the technology not only at the point of sale but also to determine consumer preferences and tastes, while new “learning” algorithms enter the market in applications such as autonomous cars.

This expansion of global markets has been aided by the falling cost of long-distance shipping, giving smaller manufacturers and cottage industries access to otherwise unreachable markets and providing consumers with more choices and better prices.

Adapting and retooling is the new rule of global markets: re-engineering of the supply chain, big data analysis and Internet cloud platforms are bringing new realities to trade beyond Bretton Woods and the WTO. As a result, uninformed politicians like Trump should wake up and smell the coffee and bring the US into this new dawn of reality, because international trade will keep evolving without the intervention of hard-headed and ignorant politicians.


Thursday, November 1, 2018

The life cycle of data varies with the needs of a particular enterprise: for instance, the life cycle of a flight data recorder’s data may end with a single flight, but if a comparative analysis must be completed across several flights over a period of time, the life cycle becomes more flexible. In general, data life cycle management (DLM) is a policy-based approach, particular to each enterprise, that manages the flow of data through an information system over its life cycle: recording the data points, classification, analysis and storage for as long as the data is useful, until time dictates that it has become obsolete and is deleted.
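As a rough illustration of these stages (not from the original text), here is a minimal sketch of a retention-driven DLM policy in Python; the stage names, the 90-day retention window and the DataRecord class are all hypothetical.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum, auto


class Stage(Enum):
    """Hypothetical DLM stages: record -> classify -> analyze -> store -> delete."""
    RECORDED = auto()
    CLASSIFIED = auto()
    ANALYZED = auto()
    STORED = auto()
    DELETED = auto()


@dataclass
class DataRecord:
    created: date
    stage: Stage = Stage.RECORDED


def apply_dlm_policy(record: DataRecord, today: date, retention: timedelta) -> DataRecord:
    """Advance a record one step through its life cycle, deleting it once obsolete."""
    if today - record.created > retention:
        record.stage = Stage.DELETED                      # data has outlived its usefulness
    elif record.stage.value < Stage.STORED.value:
        record.stage = Stage(record.stage.value + 1)      # move to the next stage
    return record


# A record created in August, evaluated in November under an assumed 90-day retention window.
rec = apply_dlm_policy(DataRecord(created=date(2018, 8, 1)),
                       today=date(2018, 11, 1), retention=timedelta(days=90))
print(rec.stage)  # Stage.DELETED
```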
How data is integrated into the IT value chain is again particular to each enterprise’s needs. In general, a value chain can be defined as the series of activities an enterprise performs in order to deliver its product or service. A product or service must move through a chain of events before it is delivered, adding value at each step of the process. The value chain framework is designed around each activity in particular, but in general it is divided into two main categories (a short sketch follows the list):
1- Primary activities for the production or delivery of goods or services, which a business needs in order to exist and function in a socio-economic environment
2- Supporting activities, like logistics or finance, which help the primary activities run efficiently as they move through the value chain.
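As a rough sketch of how these two categories might be modeled (the activity names and the ValueChain class are hypothetical, not part of any standard framework):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Activity:
    name: str
    primary: bool  # True: production/delivery; False: supporting (logistics, finance, ...)


@dataclass
class ValueChain:
    activities: List[Activity] = field(default_factory=list)

    def primary(self) -> List[str]:
        return [a.name for a in self.activities if a.primary]

    def supporting(self) -> List[str]:
        return [a.name for a in self.activities if not a.primary]


chain = ValueChain([
    Activity("manufacturing", primary=True),
    Activity("distribution", primary=True),
    Activity("logistics", primary=False),
    Activity("finance", primary=False),
])
print(chain.primary())     # ['manufacturing', 'distribution']
print(chain.supporting())  # ['logistics', 'finance']
```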
Quality control over data is increasingly important for organizations that make data-driven decisions. Several measures become essential for these activities as the expansion and management of data flows grow more challenging.
Presently, many organizations have an increasing demand for high-quality data as the bar rises for analysis techniques and as quality data is required to comply with new regulations and legislation. This demand for quality data also implies quality sourcing, not limited to the data residing in the organization’s IT systems.
High data quality is also needed to improve organizational performance, logistics support, growth and competitive advantage, and to comply with the growing body of data collection regulations.
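To make the idea of quality control concrete, here is a minimal sketch of two common checks, completeness and validity, applied to individual records; the field names and rules are assumptions for illustration only.

```python
from typing import Dict, List

# Hypothetical required fields for a customer order record.
REQUIRED = ["customer_id", "country", "order_total"]


def quality_issues(record: Dict[str, object]) -> List[str]:
    """Return the data quality problems found in a single record."""
    issues = []
    for name in REQUIRED:
        if record.get(name) in (None, ""):
            issues.append(f"missing {name}")              # completeness check
    total = record.get("order_total")
    if isinstance(total, (int, float)) and total < 0:
        issues.append("order_total is negative")          # validity check
    return issues


records = [
    {"customer_id": "C1", "country": "BR", "order_total": 120.0},
    {"customer_id": "C2", "country": "", "order_total": -5},
]
for r in records:
    print(r["customer_id"], quality_issues(r))
# C1 []
# C2 ['missing country', 'order_total is negative']
```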
Various sectors of the economy, such as medical devices, financial services, telecommunications, pharmaceuticals and consumer markets, are subject to stricter regulations, and those that collect personal information are also subject to privacy legislation.
Data complexity and growth are also a challenge for an organization where the understanding of data quality is unclear and data management is treated as an IT department responsibility rather than a business-side responsibility.
Unfamiliarity with collection methods within the organization, such as robotically operated processes, adds to the difficulty, especially when data is transported and transformed as it moves through the chain. Particularly when transformations are complex, it can take an IT specialist to determine which data elements belong to one another.
The inherent complexity of tracking data increases in companies with multipolar IT environments built on many legacy systems whose existing reporting flows need improvement or replacement. It is a known factor that the more computation a flow requires, the more complicated it is to capture and interpret its meaning.
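One way to keep that meaning traceable is to record the lineage of a value as it passes through each transformation. The sketch below illustrates the idea with an invented reporting flow; the TracedValue class and the step names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class TracedValue:
    """A value plus the list of transformation steps it has passed through."""
    value: float
    lineage: List[str] = field(default_factory=list)

    def apply(self, name: str, fn: Callable[[float], float]) -> "TracedValue":
        # Record the step name alongside the transformation itself.
        return TracedValue(fn(self.value), self.lineage + [name])


# Hypothetical flow: raw sensor reading -> unit conversion -> rounding for a report.
raw = TracedValue(101.337)
result = (raw.apply("convert_kpa_to_bar", lambda v: v / 100)
             .apply("round_2dp", lambda v: round(v, 2)))

print(result.value)    # 1.01
print(result.lineage)  # ['convert_kpa_to_bar', 'round_2dp']
```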