
Wednesday, April 24, 2013

New Types of Income from the Internet: New Methods to Earn Money Online



Surely, every one of you would like to earn some extra money, and the internet can supply it. However, the online businesses that we all know have made a lot of people rich are now very crowded: plenty of people are trying to do the same thing, and only a few succeed while most fail. Today, the easiest way to make money from the web is through new types of revenue-generating online businesses.

New online businesses come out every day. All you need to do is know how to spot them. So, listed below are a few of the new trends on the internet that are known to let a lot of people build an income.

The first is becoming an online stock or forex trader. These days, thanks to the internet, people can invest in the stock and forex markets and trade right from the comfort of their own homes. If you know something about trading in the stock and forex markets, then this online work or business may be right for you. Not only does it have the potential to make you a lot of money, it also gives you a way to earn some cash alongside your day job.

The best thing about this online business is that you don't need thousands and thousands of dollars in cash to invest. Even with just a hundred dollars, you will be able to begin trading stocks or forex.

To start, you need a computer or laptop with an active broadband internet connection. Then, you need to open an account and deposit a minimum amount of money that will be used for trading. You will see that there are quite a few online stock brokerage websites you can sign up with. Always select a brokerage that has a good reputation and experience with online stock or forex trading.

If you want to trade yourself, then practice is the key to earning more money. You have to remember that stock and forex trading can get quite complex. This is why you may want to try some of the stock and forex simulators that many online brokerages offer.

Another web business that can help you earn some money is working as an outsourced call center agent. The best thing about it is that not only does it have great potential to earn you good money, it also lets you work right from the comfort of your home. You have to understand that many companies now outsource their help desk or customer support services to individuals such as yourself.

If you already have a high-performance computer and an active internet connection, you can find all the software necessary to take calls from their customers.

These are two new ways of earning money from the internet. With these two, you can be sure you will be able to make some extra cash. In fact, if you become good at it, you can even consider it your regular job.


Different Types Of Internet Marketing




You can use this information to determine whether an Internet advertising organization in India meets your online marketing requirements and needs. You can also ask neighbors or business associates if they know of any good online marketing company in India. Visiting online forums is also helpful in finding reliable web advertising firms in India.
The best thing about internet marketing is that each stage of the campaign can be traced and tracked against the sorts of people it should be attracting: age groups, genders, places in the world and so on. Once this information has been accumulated, more can be done to engage the groups of people the campaign is not attracting and consequently reach more potential customers. There are a few downsides to this style of marketing, though these are greatly outweighed by the good points. For one, not everyone in the world has access to a computer, let alone the web, and you must also allow for the people who only have low-speed internet connections.
Internet marketing is nothing but marketing your products and selling them on the World Wide Web. If you have just started your online business and have built an effective website, it is time to use some marketing techniques to bring the website to prominence.
Do remember to ask the internet firm in India to provide you with references or a portfolio of their previous projects. Talk to a few of their past clients and find out whether they are happy with the services provided by the business, and whether they would recommend it to you and to others.
Search engine marketing is the term used to describe promotion and advertising that generates traffic from search engines. Marketing websites through search engines produces qualified traffic, because people have already typed their criteria into the search engine. The criteria entered are queried against the search engine's database, and relevant sites are displayed.
One of the most elementary ways of doing internet advertising is through SEO articles. This is all about keyword promotion and how the content of a post can improve a website's visibility. You should be well aware of how to use SEO articles properly.
This is already a strong foundation, but you can take the research to the next level by incorporating keyword research. This involves determining the key terms used by industry leaders, competitors and, most importantly, your target market. The internal research can fuel the creative back end of your collateral generation, even in print, while the keyword compilation can inform the meta information and content generation for the website, laying the groundwork for the much-publicized practice of SEO (search engine optimization).

Types of Web Hosting


Web hosting is the service offered by companies to have your website placed on the internet. These companies are called web hosts or web hosting providers. Your website can be a personal one or a business venture. Obviously, the needs for different websites vary. A personal website can have smaller needs as compared to a business website that needs more functions and services. Nevertheless, web hosting providers have made putting up websites easier.
Nowadays, there are several web hosting providers online. Each is trying to outdo the others by offering better services and packages. Cost is another area in which web hosts try to outdo each other. All this competition will eventually work in your favor as the customer. What you can do to get the best deal is to know what these web hosts are offering.
Do you know that there are different kinds of web hosting? You may need the following information if you are planning to put up your own website.
Read on and be enlightened.

Free Web Hosting

This is for you if you just want a personal website. You can use it to showcase family events, hobbies and other personal matters. The services offered are limited, so don't expect to build an elaborate and sophisticated website with it. Likewise, don't expect many visitors, because you do not have your own domain name. A domain name is your personal address on the web; internet users use it to go to your website. Since your domain name is generic, people will find it hard to find your website. However, you can tell family members and personal friends about your website, and then they can visit it and see what you have there.
More so, technical support and security options are limited. Expect a few or limited software options for you. At the most, you will get a free email from the web host. Nevertheless, the feeling of success you get from having your own website can offset the limited options provided by this kind of web hosting.

Shared Or Virtual Hosting

If you are planning to put up a small online business, then this type of hosting can be for you. You will get your own domain name. With your domain name, possible clients can easily find your website and find out what you have to offer them. Multiple software solutions are provided which you can use in building and managing your website more easily. The support in this type of web hosting is good. This means that you can rely on them to provide assistance to problems you might encounter in running your website. This type of web hosting is cost effective since you will be sharing a server with other websites.
As a result of this setup, security will be greatly reduced. This means that spammers or hackers can illegally access your website. More so, there are restrictions on the volume of traffic. Database and software supports are restricted.

Dedicated Hosting

If you have the money to spend, you can get this type of web hosting. This is expensive but you can get many advantages that go with the high price. For one, you will have multiple domains which can attract larger traffic.  You get powerful email solutions that can greatly help in the marketing aspect of your online business. You will also be provided with unlimited software support as well as strong database support. You must be knowledgeable enough to make full use of all these features.

Colocated Hosting

This type of web hosting allows you to have your own server, which is why it is expensive. You get high bandwidth, unlimited software options, stronger security measures and reliable uptime. The web host is responsible for the security and maintenance of your server.

I have two web services with two complex types that have identical structures but different names. How do I ensure compatibility with BPEL?


I am trying to coordinate two different web services with BPEL. I want to take the output array of integers from one service and feed it to the next web service as input. I am doing this with a BPEL assign construct, which complains that the two complex types are not compatible.
Here are the relevant parts of the WSDLs:
Person Position Info returns an array of integers as a complex type:
<wsdl:message name="personPositionInfoResponse">
<wsdl:part name="parameters" element="ns:personPositionInfoResponse"/>
</wsdl:message>

<xs:element name="personPositionInfoResponse">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="return" type="xs:int"/>
</xs:sequence>
</xs:complexType>
</xs:element>
Position Skill Management takes an array of integers as input:
<wsdl:message name="positionSkillRequest">
<wsdl:part name="parameters" element="ns:positionSkill"/>
</wsdl:message>

<xs:element name="positionSkill">
<xs:complexType>   
<xs:sequence>
<xs:element minOccurs="0" name="positionID" type="xs:int"/>
</xs:sequence>
</xs:complexType>
</xs:element>
This is my best idea so far:
    <bpel:copy>
        <bpel:from>
            <![CDATA[$personalInfoServiceOutput.parameters]]>
        </bpel:from>
        <bpel:to >
            <![CDATA[$positionSkillManagementInput.parameters]]>
        </bpel:to>
    </bpel:copy>
This returns:
The from-spec of "" is not compatible with to-spec of "" - Element in httplocalhost...:8080/axis2/services/PersonalInfoService?wsdl differs from in httplocalhost...:8080/axis2/services/PositionSkillManagementService?wsdl - different QNames: ns:return vs ns:positionID
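One workaround, sketched under the assumption that the variable names and the `ns` prefixes from the snippets above are in scope: since the validator objects to the differing element QNames (`ns:return` vs `ns:positionID`), copy the integer value inside the elements rather than the wrapper elements themselves.

```xml
<!-- Copy the inner int value, not the wrapper element, so the differing
     QNames (ns:return vs ns:positionID) never have to match.
     Variable and part names are taken from the question; adjust the ns
     prefixes to the namespaces declared in your own WSDLs. -->
<bpel:assign>
    <bpel:copy>
        <bpel:from>
            <![CDATA[$personalInfoServiceOutput.parameters/ns:return]]>
        </bpel:from>
        <bpel:to>
            <![CDATA[$positionSkillManagementInput.parameters/ns:positionID]]>
        </bpel:to>
    </bpel:copy>
</bpel:assign>
```

Note that with a true array (maxOccurs="unbounded") a single copy moves only one value; in that case the usual approaches are a bpel:forEach loop over the repeating elements, or an XSLT stylesheet applied with bpel:doXslTransform that renames the elements in one step.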

Web technology: 5 things to watch in 2013


The evolution of the Web is a messy process.
We do so much with the Web today that it's easy to take it for granted. Banking, social networking, word processing, travel planning, education, shopping -- the Web is reaching to new domains and tightening its grip where it's already used. To match that expansion, the Web is evolving.
But the Web is built by countless individuals -- browser engineers who enable new technology, Web developers who bring that technology online, and standards group members who iron out compatibility wrinkles. With so many constituents, it's no wonder there's so much craziness in charting the Web's future.
The new year will bring new chaos on the Web, and things will be sorted out in only some areas. Here's a look at what'll settle down in 2013 -- and what won't.
Alternabrowsers
iOS comes with Safari. Windows Phone comes with Internet Explorer. Android comes with its own browser and, for Android 4.x users, Chrome. It's a very different way of doing things compared to the browser free-for-all in the PC market.
In 2013, though, there's a chance people will exercise choice where they can and reject a future where browsers end up being effectively locked to the mobile OS.
The forces for lock-in are strong, if for no other reason than that it's just simpler to use a smartphone's built-in browser. But don't forget -- there was a day when IE ruled the desktop browser world. In 2012, programmers laid the groundwork for big-name alternabrowsers.

Today, the companies that control the mobile operating systems -- Apple and Google -- lead the race for mobile browser usage.
(Credit: data from Net Applications; chart by Stephen Shankland/CNET)
We saw the arrival of Chrome on iOS and the reboot of Firefox on Android. iOS and Windows Phone place restrictions on third-party browsers, but Android is open, and other browsers there include Dolphin, Opera Mini, Opera Mobile, and UC Browser.
The restriction on iOS is that third-party browsers must use an Apple-supplied version of the WebKit browser engine that's more secure but slower than the version Safari uses. Windows Phone and Windows RT have related restrictions.
On personal computers, it's completely ordinary to switch to another browser for its security, performance, or features. In the mobile world, that's not the case.
But the alternative browsers -- especially when companies like Google put marketing muscle and brand equity behind them -- could convince people that maybe they should venture farther afield. With Android spreading into more hands than iOS, it's possible the openness of the PC browser market could take hold in mobile, too.
Oh, one more thing -- don't be surprised to see a Mozilla browser on iOS, too.
Firefox OS makes a peep
Mozilla announced some early progress with Firefox OS in 2012 -- though it failed to deliver it during the year as it had planned. Expect the browser-based operating system, which runs Web apps and is geared for budget smartphones, in early 2013.

The first big Firefox OS partner is Telefonica, which plans to offer phones in Latin America with the operating system as a cheaper smartphone alternative. Firefox is barred from iOS and Windows RT, and it is a rarity on Android. Without a presence in the mobile market, Mozilla can't use its browser as leverage to pursue its goal of an open Internet. Firefox OS, geared for smartphones and running browser-based apps, is Mozilla's answer. With it, Mozilla hopes to break the ecosystem lock that is settling people into the phone-OS-app store-cloud service silos from Apple, Google, Microsoft, and Amazon.
"Mozilla's prediction is that in 2013, the Web will emerge as a viable mobile platform and a third, alternative option to closed, proprietary walled gardens," said Jay Sullivan, Mozilla's vice president of products. Firefox and Firefox OS obviously are key parts of Mozilla's effort to make that happen.
Firefox OS won't be an easy sell, since inexpensive Android phones are common and iPhones continue to spread. But carriers can't be happy ceding power to Google and Apple. And Mozilla doesn't need 40 percent market share to claim victory; it just needs a foothold big enough to keep Web programmers from coding mobile sites only for the big boys.

Web standards divisiveness persists
Those hoping for an end to the rift in Web standards governance will most likely have to keep waiting.
The new frontier of emerging Web standards is populated by a hodge-podge of acronyms.
(Credit: Bruce Lawson)
The World Wide Web Consortium long has played a central role in revising the standards out of which the Web is built, but a decade ago it chose to push a standard called XHTML that wasn't compatible with HTML. The browser makers, it turned out, had veto power, and largely ignored XHTML in favor of advancing HTML on their own through a group called WHATWG. This split persists -- and it's not going away.
The W3C is enthusiastic about HTML and related Web standards such as CSS for formatting. But even as it's ramped up its efforts, with plans to finish HTML5 standardization in 2014, the WHATWG has moved to a "living document" model that constantly updates HTML.
W3C CEO Jeff Jaffe has been trying to speed up Web standardization, with some success, and the W3C has remained relevant when it comes to CSS and some other work. But it has yet to fully regain its status with HTML itself, despite new members, new editors, and new energy. In fact, the cultural gulf in some ways appears to be widening. Even as the W3C's formal committee machinations expand with new members, the WHATWG's HTML editor, Ian Hickson, is moving the other direction. He said in a Google+ post:

Consensus (also known as "design by committee") is a terrible way to design a language or platform. Committee design dilutes responsibility and blame (everyone just starts saying things like "yeah, I didn't like it, but we had to do that to get consensus") while letting everyone take credit for everything (since their ok is necessary to get consensus), which makes it an attractive proposition for people who want to further their careers without really doing any work...
You end up with a technology that doesn't know what it is and doesn't do anything well.
Web standards continue to evolve, but at least regarding HTML itself, it doesn't look like either side will agree the other has the superior process.
High-res images on the Web
Apple's Retina displays -- the high-resolution screens used in iPhones, iPads, and MacBooks -- enable a new level of crispness and clarity in images and text. Software makers have been gradually updating their programs with new icons, graphic elements, and abilities to take advantage of the displays. It's been work, but not exactly a major re-engineering effort.
The W3C's new HTML5 logo stands for more than just the HTML5 standard.
(Credit: W3C)
But Retina on the Web is a very different matter. First of all, nobody likes slow-loading pages, and Retina imagery has four times the pixels as conventional imagery. Worse, more of the Web is moving toward mobile devices that have an even harder time managing big images and whose data usage is pricey, and you especially don't want mobile users downloading multiple versions of the same image when they don't need to.
At the same time, mobile devices are often held closer to the eye than PCs but using physically smaller screens with higher pixel densities. That means old assumptions no longer are valid about how many pixels wide a graphic should be. The technology to fix this has the label "responsive images."
Standards to the rescue! But uh-oh: Two camps each favor their own approach -- one called the srcset attribute, the other known as the picture element.
Resolution probably will come in 2013, though.
There have been emotional differences of opinion, but Robin Berjon, one of the five new HTML editors at the W3C, sees discussions as fruitful now. He said in a blog post:
We have two proposals for responsive images, the srcset attribute and the picture element. Both have now reached the level of maturity at which they can be most usefully compared, and this discussion is about to go through a new chapter.
Browser makers and Web developers are actively moving to high-resolution graphics and videos on Retina-capable devices, so regardless of what happens in standards groups, the responsive images issue will be fixed. After all, high-resolution displays are increasingly common, mobile devices are increasingly important, and nobody likes looking at pixelated, mushy images when they don't have to.
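To make the two competing proposals concrete, here's a rough sketch of each; the file names are placeholders, and the syntax follows the draft proposals under discussion at the time, so the details could still change:

```html
<!-- srcset attribute: the browser picks a candidate based on display density -->
<img src="photo.jpg"
     srcset="photo.jpg 1x, photo@2x.jpg 2x"
     alt="Example photo">

<!-- picture element: the author controls the choice with media queries -->
<picture>
  <source media="(min-width: 64em)" srcset="photo-large.jpg">
  <source media="(min-width: 37.5em)" srcset="photo-medium.jpg">
  <img src="photo-small.jpg" alt="Example photo">
</picture>
```

Either way, the goal is the same: a Retina-class screen fetches the high-resolution file, while a low-density or narrow screen downloads only the smaller one.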
Web bloat
The good news is the Web is getting steadily more sophisticated, powerful, and useful. The bad news is there's a price to pay for those advantages. Unfortunately for those who have capped data plans or who live in rural areas with subpar broadband, that increase in Web sophistication means Web pages get bigger and take longer to fetch.
The HTTP Archive's records show a steady increase in the size of Web pages over the last two years.
(Credit: HTTP Archive)
There's an old adage in the computing industry that the new horsepower that chips deliver is immediately squandered by new software features, so computers don't actually appear to get faster. There's a corollary in the Web world: As broadband spreads and speeds up, as faster LTE supplants 3G, so Web pages sponge up the extra network capacity.
The HTTP Archive keeps tabs on the state of the Web, and it shows just how much things are ballooning in its sample of tens of thousands of Web pages.
From December 16, 2010 to December 15, 2012, the average Web page increased in size from 726KB to 1,286KB. The amount of JavaScript increased from 115KB to 211KB. And the images ballooned from 430KB to 793KB.
An optimist can find good news here, too. Google has an entire team devoted to making the Web faster, introducing new technology such as SPDY to speed up servers and browsers. Browser makers obsessively test new versions to try to catch any regressions that would slow things down. New standards make it easier for Web developers to time exactly how fast their pages actually load.
And don't forget the bloat is there for a reason. Do you really want to dial the Web back to 1997?

Social media’s 2.0 moment: Responsiveness beats planning


The social web is pressuring organizations to accelerate all forms of communications.


O’Reilly Media delivered a counter-cultural (at the time) message: The dot-com bubble had burst, but the web was here to stay as an economic and social force. The meme they coined was Web 2.0, and their manifesto was captured in a seminal blog post by Tim O’Reilly. Web 2.0 was not meant to indicate a version number, but to point out the deep, persistent patterns of the web that were rewiring business and society.

I led the consulting practice at O’Reilly Media after we coined the term Web 2.0, and I think we now find ourselves at a similar (though softer) inflection point. There are a lot of valid questions regarding the business models in social: Is Facebook not a scalable vehicle for advertising and thus overvalued? Is Groupon bad for merchants and thus doomed to fail? Was social gaming (and Zynga) overhyped?

Taking a cue from Web 2.0, I believe we need to look beyond specific applications of social media — even, God forbid, specific platforms like Facebook — in order to sort out the underlying design patterns that will endure and continue to disrupt marketing and communications.

So what are those design patterns? Here are four:

Responsiveness beats planning
Communities beat audiences
Reputation beats branding
Sociality beats media-mentality
I’ll focus on the first one for now: responsiveness beats planning. The kernel of my argument is that the social web is pressuring organizations to accelerate all forms of communications from “batch” processing to real-time interaction. The result is a fundamentally different approach to how a marketing/communications organization needs to be structured and serviced.

Human beings have spent millennia communicating in real-time. The acceleration of technology is simply an effort to catch up to our zero-latency experience of being. Whenever given a choice, we will opt for a service that delivers response times as fast as our own nervous system. The technology and processes around us are nowhere close to catching up — yet wherever they do, we see incredible value creation. Any information processing technology that moves from batch to real-time experiences a quantum leap in value, especially for those who adopt it first. Consider the arbitrage opportunity in financial systems capable of receiving prices in real-time, real-time trading desks that place advertising based on current inventory and effectiveness, the efficiency of inventory management occurring in real-time across the supply chain and you get the idea. All of the systems that surround and support modern life are accelerating into real-time systems. Social is moving into real-time precisely because that is the speed at which human beings prefer to communicate, and social technologies that have accelerated closer to real-time are now shaping customer expectations.

What are the implications?

With the rise of real-time, responsive communications, marketing and comms are experiencing a massive acceleration in the traditional timeline needed to create branded content. This goes well beyond customer care — it is more like a dynamic content production capability that marketers need in order to sustain brand relationships.

Examples:

The Presidential debates: After Barack Obama's first debate with Mitt Romney, a debate in which all sides realized Obama had turned in an incredibly poor showing, the Obama camp took just three hours to pore over the debate and edit together a commercial highlighting some of Romney's more damning statements.
Oreo's response to the Super Bowl blackout was retweeted 15,000 times and received more than 20,000 likes within 24 hours. The graphic released during the blackout was "designed, captioned and approved within minutes," thanks to members of 360i -- the cookie company's agency -- gathered in a war room during the game.
Apps like Snapchat and Poke are popularizing time-limited content and offers based on immediacy.
This type of speed and responsiveness has less to do with strategy and planning and everything to do with logistics and coordination.

It calls on marketers to actually understand how organizations are structured, how governance needs to shift to enable more responsive organizations, how we staff our accounts to develop a drumbeat of meaningful content that engages, how we equip our clients to become digital publishers of real-time communications, and how we automate as many parts of the communications "supply chain" as possible.

In the next article, I will explore the design pattern that is rewiring business, "communities beat audiences."

Using mobile and web technology to enable citizens to have their say




Much of my first three months at DFID was spent on the build-up to and organisation of the Open Up! conference, an event organised between DFID and the Omidyar Network, in association with Wired magazine. Coming relatively fresh to development, I initially did not quite recognise the weight and scale of the event. Writing the profiles for speakers and glancing at the long list of attendees made me swiftly realise my naivety, and having met development professionals at conferences and the like since, I have come to realise that this really was no everyday affair.
‘Open Up!’ brought together almost 200 development and tech professionals from around the world to share best practice and discuss how citizens can have a say in the decisions that affect their lives. Mobile and web technology can be powerful tools in enabling governments to become accountable and transparent to their citizens. Technology can provide a cheap way to disseminate information to all citizens and give them a tangible way to engage, respond and take actions that can be seen by governments and those in power.
Innovative online platforms, such as Crowd Voice, enable activists to share information. (Picture: crowdvoice.org)
During the ‘show and tell’ section of the event I found one speaker particularly memorable. Mideast Youth, who are supported by Omidyar Network, presented their open source platform Crowd Voice to draw together a global community of voices of dissent and protest from around the world. I had never before heard of such platforms and they appear to create a safe haven for those who wish to challenge the decisions or actions of their governments. The online platform itself is worth a look, it’s interactive, engaging and is self-moderated by other members on the site as well as a small team from Mideast Youth. The platform is open source and is used by various other organisations which have adapted it to their needs – open source technology is revolutionising the way that the web works and is a model which is both transparent and innovative.  Mideast Youth’s other incredible projects include Mideast Tunes, a platform for underground musicians in the Middle East and North Africa who use music for social change, and Ahwaa.org, an open space to discuss LGBTQ issues in the Middle East. Having a limited experience of programming and website design myself I was really taken aback by the innovative and interactive way the platform works, something for fellow tech enthusiasts out there. They also have an incredibly cool iPad app, if you're lucky enough to own an iPad! This really did open my eyes to what’s out there in the international development world that, although often considered niche, can be used for great things.
Another memorable example from the day was Digital Green. Digital Green provides a YouTube- and Facebook-style network called ‘Farmerbook’ for farmers around the world to share lessons in agriculture. The first questions asked by those watching the videos, produced by farmers themselves and posted on the database, tend to be about the individual: what village they are from and what family. I found this particularly interesting, as I myself prefer to learn from real people rather than from the written word; trying to learn to knit from a complex series of pictures, numbers and cryptic text was much more of a hassle than watching a nice elderly lady on YouTube run me through it step by step. Both these examples made me think more about technology as an enabler: it provides a toolkit of devices and methods that can be tailored to individual needs, with endless possibilities. Harnessing these opportunities is something that we seem to only just be getting a grip on.

There are more specific examples where technology can be used as a practical tool for delivering aid objectives. Elections are a clear place for technology to triumph in enabling transparency. During the Nigerian elections, UK aid funded a programme that used SMS messaging to enable Nigerians to hold their government to account for a free and fair election. Observers were deployed to polling stations, reporting the voting results for each station via SMS and comparing the vote tabulation with the officially announced results. Radar used a similar technique to report violence and challenges at polling stations across Sierra Leone in the November 2012 elections, which they combined with a programme to train young journalists in mobile reporting in conjunction with Leonard Cheshire Disability. Radar is a great example of how technology can promote accountability mechanisms while helping achieve other development aims, something that I’ve been discovering more of as I look into innovative ideas in more detail.

Clearly there has been great support for the ideas and messages coming from the Open Up! conference. Tim Berners-Lee, the inventor of the World Wide Web and founder of the World Wide Web Foundation, fully endorsed the Open Up! euphoria in his blog post. DFID’s own Justine Greening announced the launch of the Making All Voices Count challenge fund at the event, which will provide $45 million to support innovation, scaling-up, and research that will deepen existing innovations and help harness new technologies to enable citizen engagement and government responsiveness, clearly stating DFID’s commitment to using technology to give people a voice.

Stephen Fry, Tim O’Reilly, Rakesh Rajani and Ethan Zuckerman tweeted throughout the day, alongside a string of other big names. In total, almost 5 million Twitter accounts were reached with the #OpenUp12 hashtag, which trended on Twitter in London, Nairobi, Lagos, Paris, Berlin, California, Boston and Washington during the day.
All in all, I have concluded that my first three months with DFID have been a pretty fantastic experience. Despite not being able to tell my friends that I have met the man who invented the web, I can still say that he, Stephen Fry and others talked about an event I took part in organising. Throwing in that I met the Queen of Jordan at the High Level Panel Meeting on the post-MDGs helps too, even if they still have no idea what the post-MDGs are or what international development really is. I will keep my fingers crossed for the invitation from Sir Tim.
Clearly these issues are at the forefront of development thinking, both within the area of governance and as a cross-cutting theme, and are a main priority for DFID. What’s important is that DFID is taking these issues seriously: it has announced the Making All Voices Count fund, supports many programmes such as Laptop Ladies (which I will discuss next time), and is exploring opportunities for using technology in the humanitarian field. There are too many examples to share here, but I hope to shed some light next time on how my investigations into the use of technology for empowering and providing services for women have been even more rewarding and exciting.

Pingbacks, another federated web technology, dying




Pingbacks are a technology from the 2000s era of the blogging boom. Their role is to let publishers know when someone has linked to their articles from another part of the web.

The idea is pretty simple: define a protocol that allows content management systems to ping each other when published articles contain references to them. The referenced system can then take that information and display it inside the article, usually as a “look how popular my post is: here are the people who think it is cool!” list.
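Concretely, the Pingback protocol is a single XML-RPC call, `pingback.ping(sourceURI, targetURI)`, sent to an endpoint the target page advertises via an `X-Pingback` HTTP header or a `<link rel="pingback">` tag. As a rough sketch using only the Python standard library (the endpoint URL below is hypothetical; a real client would discover it first):

```python
import xmlrpc.client

def build_pingback_request(source_uri, target_uri):
    """Serialize the XML-RPC body a pingback client would POST
    to the target site's advertised pingback endpoint."""
    return xmlrpc.client.dumps((source_uri, target_uri),
                               methodname="pingback.ping")

# Sending it is one call; faults (e.g. "target not pingback-enabled")
# come back as xmlrpc.client.Fault. Commented out to avoid a live request:
# proxy = xmlrpc.client.ServerProxy("https://blog.example/xmlrpc.php")
# proxy.pingback.ping("https://my.site/post-that-links",
#                     "https://blog.example/2013/04/linked-article")
```

The receiving system is expected to fetch the source page and verify that it really links to the target before approving the pingback, which is exactly the verification step spammers learned to game.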

Because it means publishing links to other people’s sites, authors are expected to manually approve those links. As you can imagine, this quickly overwhelms a novice blogger, and it is unmanageable for more popular ones, especially since pingbacks are a popular SEO tactic for building links.

Big blogging platforms have many problems with pingbacks: not only does abuse by spammers confuse regular bloggers, but considerable resources have to go into sending and receiving those pings. I was not surprised to read in January that Typepad is killing the pingback functionality, and I won’t be very surprised if WordPress and Blogger follow suit.


A Spider’s Web by D. S.

So, another perfectly good idea is dying because of abuse, lack of innovation and, primarily, lack of interest. It seems that no one really has a vested interest in fixing this technology. Naturally, Facebook and Twitter have never really used pingbacks either. The big players are more interested in building walled gardens where information can be more easily controlled and aggregated; on top of that, they can then provide a better user experience and capture the revenue. Along with RSS and Google Reader, pingbacks are another 2000s-era technology that is dying.

What is curious is that RSS and pingbacks were invented to federate publishing: to enable lots of small, independently run publishing sites. Is there something inherent in federated publishing technologies that makes them unsustainable? Similarly, federated authentication and identification haven’t seen much uptake.

The internet was built on federated, distributed services, with the web and e-mail being the most prominent ones. But taking a deeper look today, we can see that behind the scenes de-federation is happening. Web serving is increasingly becoming the domain of the big guys: Google, Amazon, Facebook, plus Heroku and some other big hosts. At the same time, handling e-mail has become so complicated and costly that it’s not really a federated service any more. Gmail, Hotmail and Yahoo Mail dominate the space, and the cloud is eating even into enterprise Exchange installations.

The demise of RSS and pingbacks should therefore make us question what future we want for the web.

[Technically there are three different technologies - refback, pingback and trackback - that serve the same function. For the purposes of this article I have called them all pingbacks.]

WEB TECHNOLOGY FOCUS GROUP MEETINGS





What?   The City of Hamilton is hosting two Focus Group Meetings to gather feedback from those who may have an interest related to the selection of web technology for the City’s new website.

Why?    This is your chance to give feedback and share your ideas about how the choice of the City’s web technology could enable future innovation and collaboration.

Who?    Interested citizens, local technology developers and community stakeholders who understand how the adoption of new web technology can enable innovation and benefit citizens.

When?  Monday, April 29th – 2:00 to 4:00 p.m. or 7:00 to 9:00 p.m.

While the focus group meetings are open to any interested citizens, the afternoon session will be tailored to the business/education sectors and the evening session to broader input from citizens and community members. The City’s consultant on the Web Technology Assessment will facilitate the discussion, with support from City staff.

Where? City Hall – 71 Main Street West, 1st floor, Boardroom 193



To RSVP, or to ask questions, please contact:

Jennifer DiDomenico

City of Hamilton

Tel.: 905.546.2424 ext. 5596

How to Carbon-Date a Web Page


If a Web page lacks a time stamp, how do you know when it was created? A new Web application could help.


Ever needed to know the age of a Web page only to discover that it lacks a time stamp saying when it was published?

If so, then the work of Hany SalahEldeen and Michael Nelson at Old Dominion University in Norfolk, Virginia, may be of interest. These guys have created a Web application called Carbon Date that works out the creation date of a page by searching for the earliest evidence of its existence.

The process is straightforward. Many Web pages end up being recorded in various ways soon after they are created. For example, it’s easy to check Bitly for the first time anybody shortened the URL in question, or to use Topsy to check the first date anybody tweeted it. Then there is Memento, which reveals the first time the URL was captured by a Web archive. Google can also reveal the first time the page was indexed, and the page’s own Last-Modified HTTP response header shows when it was last changed.

Each of these is straightforward to check by itself, but checking them all to find the earliest date is time-consuming.  Carbon Date automates the process.
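The aggregation step itself is simple: collect whatever first-sighting dates the sources return and keep the earliest. A minimal sketch of that idea (not the authors’ code; the source names and dates below are hypothetical):

```python
from datetime import date

def estimate_creation_date(candidates):
    """candidates: mapping of source name -> first-seen date (or None
    if the source had no record). Returns (source, date) for the
    earliest sighting, or None if no source knew the URL."""
    sightings = [(d, src) for src, d in candidates.items() if d is not None]
    if not sightings:
        return None
    earliest, source = min(sightings)  # dates compare chronologically
    return source, earliest

# Hypothetical lookups for one URL:
seen = {
    "bitly_first_shorten":   date(2012, 5, 3),
    "topsy_first_tweet":     date(2012, 4, 28),
    "memento_first_archive": date(2012, 6, 1),
    "google_first_index":    None,   # this source had no record
}
print(estimate_creation_date(seen))  # -> ('topsy_first_tweet', datetime.date(2012, 4, 28))
```

The earliest sighting is only an upper bound on the page’s age, of course: a page can exist for years before anyone shortens, tweets or archives it, which is one reason the tool’s accuracy is limited.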

SalahEldeen and Nelson say their new tool works reasonably well.  They tested it on a set of 1200 Web pages for which the creation date was already known. “We were able to estimate a creation date for 75.90 percent of the resources, with 32.78 percent having the correct value,” they say.

That’s not quite as accurate as a researcher or journalist might like, but it’s a start. If you want to test it yourself, SalahEldeen and Nelson say they’ve made Carbon Date available at http://cd.cs.odu.edu/cd/. (However, at the time of writing it did not appear to be working.)

Update 23 April:  link updated and now working. See comment below. The service gives the estimated date of creation of www.technologyreview.com as October 2001. The historians at Tech Review tell me they first started using the domain in February 2001, so that’s not far off.

SECURE FILE SYNC BETWEEN COMPUTERS USING P2P TECHNOLOGY


BitTorrent Sync, a free to use (but not open source) tool that automatically synchronizes files between computers using the BitTorrent protocol, is now available to all as a public alpha. The application runs on Linux, Windows, Mac OS X and NAS devices.


BitTorrent Sync Web Interface Linux

The tool, which has been in closed alpha for the past four months, is advertised as "a simple tool that applies p2p protocol for direct live folder sync with maximum security, network speed and storage capacity".

And indeed, BitTorrent Sync is a great way of syncing and sharing files between computers, even files you usually wouldn't trust to cloud sync tools such as Dropbox: besides the files being transferred directly between the users (so your files don't end up on some cloud server), the connection is encrypted with an AES cipher and a 256-bit key derived from your Secret, a random string of 20 or more bytes. For increased security, there's also an option to generate a Secret that expires after a day.
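BitTorrent Sync's exact key-derivation scheme isn't documented, but the general idea of turning a shared secret string into a fixed 256-bit key can be illustrated with a hash function. The sketch below uses SHA-256 purely as an example KDF; it is not BitTorrent Sync's actual implementation:

```python
import hashlib

def key_from_secret(secret: str) -> bytes:
    """Derive a 256-bit key from a shared secret string.
    Illustrative only: real products typically use a dedicated
    KDF, but the principle is the same, i.e. the same secret
    always yields the same fixed-size key."""
    return hashlib.sha256(secret.encode("utf-8")).digest()

key = key_from_secret("A" * 20)   # Secrets are strings of 20+ bytes
assert len(key) == 32             # 256 bits, the AES-256 key size
```

Because both peers can derive the identical key from the Secret alone, no key ever has to travel over the network, which is what makes sharing a folder as simple as sharing its Secret string.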

Basically, this peer-to-peer (p2p) sync tool can be compared with cloud sync services such as Dropbox or Ubuntu One, but no server is involved, which means the sync can be faster than such services and there are no space restrictions (other than your HDD). It also means, obviously, that the computers need to be online at the same time for the sync to work.


How to use BitTorrent Sync on Linux


Arch Linux users can install BitTorrent Sync via AUR

To use BitTorrent Sync (or "BtSync") on Linux, download and extract the binary for your architecture (Ubuntu users: use the "Linux i386" or "Linux x64" links on the left; don't download the "glib 2.3" binaries). To run it, type the following in a terminal (assuming you've extracted the "btsync" binary in your home folder):
cd
./btsync

To see a list of available options, type:
./btsync --help

And if you want to stop BitTorrent Sync, use:
killall btsync

BitTorrent Sync doesn't come with a GUI for Linux, but it can be configured through a web interface, so once you've started BtSync as explained above, open a web browser and enter the following URL: http://localhost:8888/gui/

Now let's add a folder to sync. To do this, click "Add folder", then browse for a folder you want to sync and click "Generate" to generate a Secret for this folder. Then click "Add" to add the folder.

If you want to connect to other devices / synchronize a folder from a remote computer on your machine, follow the same steps as above, but instead of clicking "Generate" for the Secret, paste the Secret generated for the folder from the remote computer.

On Linux, most of the settings aren't available in the web interface. Instead, you can run:
./btsync --dump-sample-config > sync.conf

This will create a new "sync.conf" file containing a sample configuration. Modify the settings in this file to suit your needs, then run BtSync using the following command (which makes BtSync use the newly created configuration file):
./btsync --config sync.conf

For more on how to use BitTorrent Sync on Linux, as well as on Windows and Mac OS X, see the User Guide PDF file.

Download BitTorrent Sync for Linux, Windows, Mac OS X and NAS.

ASP.NET Web API: CORS support and Attribute Based Routing Improvements



We've seen a huge adoption of ASP.NET Web API since its initial release.  In February we shipped the ASP.NET and Web Tools 2012.2 Update, which added a number of additional enhancements to both Web API and the other components of ASP.NET. 

The ASP.NET team has been hard at work on the next set of features (lots of cool stuff coming).  One of the great things about this work has been how the team has used the open source development process, which we announced we were adopting last spring, to collaborate even more closely with the community: both to validate features early and to enable developers in the community to contribute directly to their development.

Below are updates on two of the great features coming to ASP.NET Web API, which were developed and contributed by ASP.NET MVP Brock Allen and by Tim McCall (of attributerouting.net fame):

CORS support for ASP.NET Web API

Cross-origin resource sharing (CORS) is a W3C standard that allows web pages to make AJAX requests to a different domain. This standard relaxes the same-origin policy implemented in web browsers that restricts calls to the domain of the resource that makes the call. The CORS specification defines how the browser and server interact to make cross-origin calls.
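At the HTTP level, the server's side of this interaction is just a matter of response headers: when a request arrives carrying an Origin header, a CORS-aware server echoes an approved origin back in Access-Control-Allow-Origin so the browser will release the response to the page. As a framework-neutral illustration (not ASP.NET; the origin list is hypothetical), a minimal WSGI app:

```python
# A minimal sketch of server-side CORS header handling.
ALLOWED_ORIGINS = {"http://xyz123.azurewebsites.net"}

def app(environ, start_response):
    origin = environ.get("HTTP_ORIGIN", "")  # set by the browser on cross-origin calls
    headers = [("Content-Type", "text/plain")]
    if origin in ALLOWED_ORIGINS:
        # Without this header, the browser blocks the page from
        # reading the response, which is the error shown above.
        headers.append(("Access-Control-Allow-Origin", origin))
    start_response("200 OK", headers)
    return [b"hello"]
```

The point of the ASP.NET Web API CORS framework is to emit exactly these headers (plus the preflight Access-Control-Allow-Methods/-Headers handling) for you, driven by configuration rather than hand-written header code.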

The following image shows the ASP.NET Web API Test Tool (running on http://xyz123.azurewebsites.net/) making a cross domain call to the Contoso domain. When you click Send, a cross-origin request is made. Because the Contoso site is not configured to support CORS, an error dialog is displayed.

The CORS error appears on the Console tab of the IE F12 tools.
For security reasons, the web browser doesn’t allow calls from the azurewebsites domain to the Contoso domain. With the new ASP.NET Web API CORS framework, Contoso.com can be configured to send the correct CORS headers so the browser will accept cross-origin calls.
MVP Brock Allen contributed his CORS source to the ASP.NET Webstack repository. Brock worked with Yao Huang Lin (a developer on the ASP.NET team) to refine and iterate on the design and then get it pulled into the Webstack repository. Brock Allen, Dan Roth, and Yao discuss Brock’s CORS contribution in this Channel 9 video.
The CORS support for ASP.NET Web API page shows how to get started with this new feature.

Attribute-Based Routing in ASP.NET Web API

We recently published, in the ASP.NET Web API roadmap, our intention to support attribute-based routing in ASP.NET Web API. Route attributes bring the URL definition closer to the code that runs for that particular URL, making it easier to understand which URL must be called for a particular block of code and simplifying many common routing scenarios.
For example, let’s say you want to define a Web API that has the standard set of HTTP actions (GET, POST, PUT, DELETE, and so on) but you also want to have an additional custom action, such as Approve. Instead of adding another route to the global route table for the Approve action, you can instead just attribute the action directly:
    public class OrdersController : ApiController
    {
        public IEnumerable<Order> GetOrders() {…}
        public Order GetOrder(int id) {…}
        public Order Post(Order order) {…}
        [HttpPost("orders/{id}/approve")]
        public Order Approve(int id) {…}
    }
An extended route template syntax makes it simple to specify default values and constraints for route values. For example, you can now easily create two actions that are selected based on parameter type. In the following People controller, the id parameter of the GetById action takes only int values, and the GetByName action method has a default name of “Nick”.
    public class PeopleController : ApiController
    {
        [HttpGet("{name=Nick}")]
        public string GetByName(string name) {…}

        [HttpGet("{id:int}")]
        public string GetById(int id) {…}
    }
You can also define common route prefixes for your web APIs. For example, you can use route prefixes to set up a resource hierarchy:
    [RoutePrefix("movies")]
    [RoutePrefix("actors/{actorId}/movies")]
    [RoutePrefix("directors/{directorId}/movies")]
    public class MoviesController : ApiController
    {
        public IEnumerable<Movie> GetMovies() {…}
        public IEnumerable<Movie> GetMoviesByActor(int actorId) {…}
        public IEnumerable<Movie> GetMoviesByDirector(int directorId) {…}
    }
Or, you can use route prefixes to handle multiple versions of your web API:
    [RoutePrefix("api/v1/customers")]
    public class CustomersV1Controller : ApiController {…}
   
    [RoutePrefix("api/v2/customers")]
    public class CustomersV2Controller : ApiController {…}
Similar to the new CORS support in ASP.NET Web API, the new support for attribute-based routing is largely a contribution from the community. We are working closely with Tim McCall of attributerouting.net fame to bring many of the features of his AttributeRouting project directly into ASP.NET Web API.
It’s really exciting to see how these collaborations across the ASP.NET Team and the community are helping to move the ASP.NET platform forward!
Hope this helps,
Scott
