To help keep you up-to-date with the latest news and ideas from the industry, we have compiled the latest articles from industry leaders and corporate blogs. New content is pulled hourly from each blog's RSS feed. The article links will take you directly to the related blog.
Bing Product Ads give you the opportunity to showcase your products in an engaging ad format that includes images, promotional text, pricing and your company name. They also let you reach the 31 million U.S. retail searchers who don’t use Google.
Online retailers should definitely consider Bing Product Ads since the Bing/Yahoo network accounts for 117 million retail clicks a month, or 22% of all U.S. retail paid clicks, as stated by comScore in June 2013. Those aren’t numbers to take lightly, especially if you are a small business just making your way in the very competitive world of retail.
Bing is trying very hard to win your business with one of its key features, which allows advertisers to import campaigns directly from Google’s PLAs. Bing Ads had been beta testing the format for several months across hundreds of advertisers with 140 million different offers.
To use Bing Product Ads, you must offer retail products with at least one SKU, have a U.S. billing address and sell products online with a secure checkout option.
Marketing experts say Microsoft was smart to take a page from Google’s Product Listing Ads playbook, which also allows marketers to run image-dense ads along with other search listings. “In this case Bing and Microsoft have learned that it’s best to watch what Google launches successfully and be a fast follower,” says Kevin Lee, CEO of online marketing firm Didit.
“The Bing Product Ads are nearly identical to the Google product,” Lee says. “That’s good. It makes it easy for both retailers and their agencies to go live.”
David Pann, general manager of the search network at Microsoft, writes in a blog posted on the Bing Ads website that Bing Product Ads were designed for easy deployment. “You don’t have to write ad copy and develop keywords for each campaign,” he writes. “Instead, Product Ads builds upon work that you’ve already done, automatically serving the appropriate image, price and brand name.”
One difference between Bing Product Ads and Google Product Listing Ads is that the Bing ads run on the right side of the search results page, where they appear separate from natural search listings. By comparison, Google Product Listing Ads appear in the center of the search results page, where until recently they appeared among natural search results.
Second Quarter Search Spend Results
Last month the second quarter reports came out from the large search marketing firms and they pointed to continued growth in paid search investment in the US and globally. The reports also underscore the increasing influence of mobile and image-based product ads on growth.
According to RKG’s quarterly report, ad spend rose 23 percent year-over-year among RKG’s retail-heavy client base. Both Google and Bing saw spending increases.
Ad spend on Bing rose 19 percent overall. Bing non-brand spend also increased by 19 percent, nearing Q4 levels, and click volume rose by 26 percent. With mobile traffic share increasing, CPCs fell 6 percent year-over-year.
Spending on image-based product ads increased by 72 percent year-over-year in Q2. Click share rose by 28 percent and CPCs bumped up 35 percent. In contrast, text ad spending rose by just 11 percent year-over-year.
Bing Product Ads, which launched in beta last fall and became available to all advertisers at the end of March, drove 8 percent of Bing Ads non-brand clicks and spend. Bing Product Ads generated 48 percent higher revenue per click (RPC) than non-brand text ads, far outpacing Google’s 17 percent RPC spread between PLAs and non-brand text ads. CPCs were 8 percent lower and CTRs were 19 percent better.
The Heartbleed bug is a recently discovered security vulnerability that puts users’ passwords at risk on several popular websites. It is an extremely serious problem that has single-handedly turned the whole internet on its head.
Heartbleed is a security vulnerability in OpenSSL software that lets a hacker access the memory of data servers. According to the internet research firm Netcraft, 500,000 websites could potentially be affected. This means that a user’s sensitive personal data, including usernames, passwords and credit card information, is at risk of being stolen. It also means that companies are in danger of having their servers’ digital keys, used to encrypt communications, stolen along with confidential internal documents.
Secure Sockets Layer (SSL), now known by its current name, Transport Layer Security (TLS), is the most basic means of encrypting information on the Web, and it mitigates the potential for someone to eavesdrop on you as you browse the internet. The “https” in the URL of SSL-enabled sites like Gmail, rather than simply “http,” is the indication that SSL is in use.
The vulnerable versions of OpenSSL, the most widely used open-source SSL implementation, are 1.0.1 through 1.0.1f. OpenSSL also ships with the Linux operating system and is a component of Apache and Nginx, two very widely used programs for running websites. In other words, it is used extensively across the Web.
Discovery of the bug
Credit for the discovery of the Heartbleed bug goes to the security firm Codenomicon and Google researcher Neel Mehta, who found the bug independently on the same day.
Mehta donated the $15,000 bounty he was awarded for helping find the bug to the Freedom of the Press Foundation’s campaign to develop encryption tools for journalists to use when communicating with sources. Mehta has declined press interviews, but when asked for a comment, Google said, “The security of our users’ information is a top priority. We proactively look for vulnerabilities and encourage others to report them precisely so that we are able to fix them before they are exploited.”
How it got its name
According to Vocativ, the term “Heartbleed” was coined by Ossi Herrala, a systems administrator at Codenomicon. It has a nicer ring to it than the bug’s official identifier, CVE-2014-0160.
Heartbleed is a play on words referring to an extension on OpenSSL called “heartbeat.” The protocol is used to keep connections open, even when data isn’t being shared between those connections. David Chartier, chief executive of Codenomicon told Vocativ, “Herrala thought it was fitting to call it Heartbleed because it was bleeding out the important information from the memory.”
The name was specifically chosen to be catchy because the team at Codenomicon wanted something press friendly that could catch on quickly, to warn more people of the important bug. Soon after they named the bug, they bought the domain Heartbleed.com to educate people about the very destructive bug.
Some sites aren’t affected
Although OpenSSL is very popular, there are other SSL/TLS options. In addition, some websites use an earlier, unaffected version, and some didn’t enable the “heartbeat” feature that was central to the vulnerability.
While it doesn’t solve the problem, the scope of the potential damage is mitigated by perfect forward secrecy, or PFS, a practice that ensures encryption keys have a very short shelf life and are not reused indefinitely. This means that if an attacker did get an encryption key out of a server’s memory, the attacker wouldn’t be able to decode all secure traffic from that server, because each key’s use is very limited. While some tech giants, like Google and Facebook, have started to support PFS, not every company does.
How the bug works
The vulnerability allows a hacker to access up to 64 kilobytes of server memory at a time, but the attack can be performed again and again to harvest a massive amount of information. This means an attacker can retrieve not only usernames and passwords, but also “cookie” data that Web servers and browsers use to track individuals and ease login. According to the Electronic Frontier Foundation, repeating the attack could yield more serious information, including a site’s private SSL key, used to encrypt traffic. With this key, someone could run a fake version of a website and use it to steal all other kinds of information, like credit card numbers or private messages.
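The flaw itself was a missing length check in OpenSSL’s heartbeat handler: the server echoed back as many bytes as the request *claimed* to contain, not as many as it actually did. The following is a toy Python model of that logic, not the actual OpenSSL C code; the memory contents and byte offsets are invented purely for illustration.

```python
# Toy model of the Heartbleed over-read (an illustration, not OpenSSL's C code).
# The heartbeat reply copies `claimed_len` bytes starting at the request
# payload's position in server memory, trusting the attacker-supplied length.

def heartbeat_reply(memory: bytes, payload_start: int, payload_len: int,
                    claimed_len: int, patched: bool = False) -> bytes:
    if patched and claimed_len > payload_len:
        return b""  # the fix: drop heartbeats that lie about their length
    # Vulnerable path: read claimed_len bytes, spilling adjacent memory.
    return memory[payload_start:payload_start + claimed_len]

# Pretend server memory: a 4-byte heartbeat payload sits next to secrets.
memory = b"PING" + b"user=alice;password=hunter2;ssl_key=4f3c..."

print(heartbeat_reply(memory, 0, 4, 4))                 # honest request: b"PING"
print(heartbeat_reply(memory, 0, 4, 40))                # leaks password and key data
print(heartbeat_reply(memory, 0, 4, 40, patched=True))  # patched server returns nothing
```

Repeating the oversized request against different regions of memory is what lets an attacker accumulate the 64 KB chunks described above.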
Changing your password
For many websites, changing your password will help protect your important information from attack. However, wait until you get confirmation from the website operator that the bug has been patched. It’s a natural reaction to want to change all of your passwords immediately, but if the website’s bug has not been fixed yet, making the change could be useless: you would only be giving an attacker your new password.
Checking to see if a website is affected
A few companies and developers have created testing sites to check which websites are vulnerable or safe. Two of the better sites are by LastPass, a company that makes password management software, and Qualys, a security firm. While these test sites are a good preliminary check, continue to proceed with caution, even if the site gives you an all-clear indication. If you’re given a red flag, however, avoid the site altogether.
Who came up with the bug
According to the Guardian, the programmer who wrote the flawed code was Robin Seggelmann, who worked on the OpenSSL project while pursuing his Ph.D. from 2008 to 2012. Adding to the drama of the situation, he submitted the code at 11:59 p.m. on New Year’s Eve 2011, though he claims the timing has nothing to do with the bug. “I am responsible for the error, because I wrote the code and missed the necessary validation by an oversight,” Seggelmann said.
Because OpenSSL is an open-source project, it’s hard to place the blame completely on one person. As Zulfikar Ramzan, chief technology officer of cloud security startup Elastica, explained to The New York Times, so much complex code had been written that the Heartbeat protocol in particular did not get enough scrutiny. “Heartbeat is not the main part of SSL. It’s just one additional feature within SSL,” he said. “So it’s conceivable that nobody looked at that code as carefully because it was not part of the main line.”
Is my bank account in danger?
Most banks don’t use OpenSSL, but instead use proprietary encryption software. But if you’re unsure, contact your bank directly for confirmation that the website is secure. Still, John Miller, security research manager for security and compliance firm TrustWave, suggests keeping a close eye on financial statements for the next few days to make sure there are no unfamiliar charges.
Information travels along familiar routes in most organizations. Proprietary information is stored in databases and analyzed in reports, then goes up the chain of management. Information also originates externally, gathered from public sources, collected from the internet or purchased from information suppliers.
But the normal methods used to retrieve data are changing; the physical world itself is becoming its own information database. In what’s called the Internet of Things (IoT), sensors and actuators are embedded in physical objects, from roads to pacemakers, and these physical objects are linked through wired and wireless networks, often using the same Internet Protocol (IP) that connects the internet. These networks pour out huge volumes of data that flow to computers for analysis. When objects can both sense the environment and communicate, they become tools for understanding complexity and responding to it quickly. The revolutionary part is that these physical information systems are now beginning to be deployed, and some of them even work largely without human intervention.
Widespread adoption of the Internet of Things will take time, but it’s advancing faster thanks to improvements in underlying technologies. Advances in wireless networking technology and the greater standardization of communications protocols make it possible to collect data from these sensors almost anywhere, at any time. Smaller silicon chips for these applications are gaining new capabilities, while costs, following the pattern of Moore’s Law, are falling. Huge increases in storage and computing power, some available through cloud computing, make analysis possible on a massive scale at a declining cost.
New networks that link data from products, company assets and the operating environment generate better information and analysis, which can enhance decision making significantly. Some organizations are starting to deploy these applications in targeted areas, while more cutting-edge and intensive uses are still in the conceptual or experimental stages.
When products are embedded with sensors, companies can track the movements of these products and even monitor interactions with them. Business models can be modified to take advantage of this behavioral data. For example, some insurance companies are offering to install location sensors in customers’ cars. This allows the companies to base the price of policies on how a car is driven, as well as where it travels. Insurance prices can then be customized to the actual risks of operating a vehicle rather than based on proxies such as a driver’s age, gender, or place of residence.
In the business-to-business arena, one popular use of the Internet of Things involves using sensors to track RFID (radio-frequency identification) tags placed on products moving through supply chains, which greatly improves inventory management while reducing working capital and logistics costs. The range of possible uses for tracking is expanding.
In the aviation industry, sensor technologies are creating new business models. Manufacturers of jet engines retain ownership of their products while charging airlines for the amount of thrust used. Airplane manufacturers are building airframes with networked sensors that send continuous data on product wear and tear to their computers, allowing for proactive maintenance and reducing unplanned downtime.
Enhanced awareness of situations
Data from several sensors, placed in infrastructure (like roads and buildings) and reporting on environmental conditions (soil moisture, ocean currents or weather), can give decision makers a heightened awareness of real-time events, particularly when the sensors are used with advanced display or visualization technologies.
Security personnel, for instance, can use sensor networks that combine video, audio and vibration detectors to spot unauthorized individuals who enter restricted areas. Some advanced security systems already use some of these technologies, and more powerful, farther-reaching applications are emerging as sensors become smaller and more capable.
Software systems are becoming more adept at analyzing and displaying captured information. Logistics managers for airlines and trucking lines are already tapping some early capabilities to get up-to-the-second knowledge of weather conditions, traffic patterns and vehicle locations. This increases their ability to make constant routing adjustments that reduce congestion costs and increase a network’s effective capacity. In other applications, law-enforcement officers can get instantaneous data from sonic sensors that pinpoint the location of gunfire.
Automation and control
When you make data the foundation for automation and control, you convert the data and analysis collected through the Internet of Things into instructions that flow back through the networks to actuators, which in turn modify processes. Closing the loop from data to automated applications can raise productivity, as systems that adjust automatically to complex situations make many human interventions unnecessary. Early adopters are creating relatively basic applications that provide a fairly immediate payoff. Advanced automated systems will be adopted by organizations as these technologies further develop.
The Internet of Things has unlimited possibilities. However, business policies and technical challenges must be overcome before these systems are widely adopted. Early adopters will need to prove that the new sensor-driven business models create superior value. Industry groups and government regulators should study rules on data privacy and data security, particularly for uses that touch on sensitive consumer information.
The legal liability for the bad decisions of automated systems will have to be established by governments, companies and risk analysts in tandem with insurers. The technology costs of sensors and actuators must fall to levels that will spark widespread use.
Networking technologies and the standards that support them must evolve to the point where data can flow freely among sensors, computers and actuators. The software used to aggregate and analyze data, as well as graphic display techniques, must improve to the point where huge volumes of data can be absorbed by human decision makers or synthesized to guide automated systems more appropriately.
Within companies, big changes in information patterns will have implications for organizational structures, as well as for the way decisions are made, operations are managed and processes are conceived. Product development, for example, will need to reflect far greater possibilities for capturing and analyzing information.
Email marketers often focus only on the number of opens and clicks that come from email subscribers. They tend to forget about conversions, which are, of course, the most important metric. Purchases, registrations, reads, downloads and the like will increase only if you set them as a goal. If you know exactly what you want your email subscribers to do after receiving your message, it becomes much easier to achieve that outcome with the appropriate subject lines, call-to-action buttons, images, content and other email elements.
You should spend 80% of your time perfecting your subject line, call to action and headlines, and 20% writing copy. You need to create subject lines and call to action buttons with your email subscriber in mind.
Be very specific, relevant and useful to your customer. In this case, shorter is not always better, so replace the usual one word call to action with something more specific in order to get your email subscribers to act on your emails.
• Be visually unique; for example, make your subject line stand out by trying square brackets, symbols, etc.
• Use digits, action words and timely topics, for example, use an end date for a campaign to motivate email subscribers to react to the message right here and now.
• Phrase your call to action as a question.
• Make the call to action appropriate to where email subscribers are in the buying cycle.
Use the golden real estate effectively
When creating your email design, use the golden real estate wisely. The golden real estate is the upper left-hand corner of a newsletter: because we read in an F-shaped pattern, this is where attention goes first. So don’t waste this space on images or logos. Leave it for text that directly calls on your email recipients to act.
In some email programs, you are able to see the first phrase of a newsletter even before opening it. Usually the phrases in this area are “View this email in a browser”, “Show remote content” or other information that is at the top of the email.
However, you can use this pre-header for marketing reasons. If before opening the newsletter you saw “Hi, your name”, wouldn’t you be more likely to open it? Any personalization increases the likelihood your email subscribers will react to your emails.
Keep it simple
Don’t give your email recipients too many choices. The simpler a message, the easier it is for email subscribers to take action. Choose to have only one call to action instead of several. Also in the spirit of keeping it simple, use the “rule of 3,” which says you shouldn’t use more than three bullet points within email content, and the “rule of 2,” which says the two most important content elements are the heading and the first paragraph. In this case, less is more.
After you have established your targets and how they like to consume information, you need to now deliver the right kind of information to them. First and foremost, stop using emails to sell and start using them to teach. People love to learn and want content and information that is of value to them.
However, don’t give everything away in the email, draw them back to your site. Give them a small taste of the information that makes them want more, then direct them to a landing page.
Constantly test and optimize
Less than half of online retailers perform some sort of optimization in their email marketing campaigns.
Set up goals, pay attention to details, and take every opportunity to analyze and optimize your campaigns. These are tips every email marketer should follow closely to get email subscribers to act on emails in the manner you need them to.
Email marketing can be a powerful tool. It doesn’t have to just be something you check off a list. When done right, you’ll notice more than just a higher open rate. You’ll see higher lead acquisition and conversion rates as well.
An affiliate marketing campaign is one in which other website owners, known as affiliates, place ads for your business on their websites. It’s important to understand how this type of marketing works. Generally, affiliates are given a code for your banner ad to place on their website and given free rein to promote their own website as they see fit. While promoting their website, they also attract attention to yours, because the banner ad directs visitors to your site.
One of the best features of affiliate marketing is that the affiliate is only compensated when they produce the desired result. This means the business owner is not obligated to pay the affiliate unless the affiliate is successful. That success may be traffic generated to your website, a completed sale, a user registration or a filled-out survey.
The compensation for affiliates is usually based on cost per click, cost per lead or cost per sale. In regards to the cost per click and the cost per lead, the affiliate is usually paid a flat rate every time a user either clicks through the banner ad on the website or performs a specific action after clicking through the ad. A cost per sale may result in the affiliate being compensated either a flat fee or a percentage of the sale depending on the agreement between the business owner and the affiliate.
The most beneficial use of affiliate marketing is to partner with affiliates who have a proven track record of promoting the businesses which they support. Most affiliate programs are open to anyone with a website. However, it is much more worthwhile to find affiliates who are able to generate a large amount of traffic to their own website. This is very important because the more visitors the affiliate receives each month, the more likely your website will receive visitors who click on the affiliate’s banner ad.
Here are some key benefits to starting an affiliate marketing campaign:
Automated advertising campaign
Affiliate tracking software is the perfect marketing tool because it lets you implement an automated advertising campaign that promotes your product or service 24 hours a day, 7 days a week.
Affiliate registrations and sales reports are processed automatically. Visitors and sales are tracked automatically.
You don’t pay unless you make money
The biggest benefit to an affiliate marketing campaign is that you don’t pay anything until you get the results you want. Your affiliate will market your site and you won’t have to pay for the advertising until you make a sale or receive the desired effect.
If you want to run a test campaign to take the pulse of your customer base, you can place your campaign in various places on the internet and measure the results. If a certain campaign isn’t working, you can then tweak it for free or for very little, instead of paying full price for an advertising campaign only to find it isn’t working.
Viral marketing strategy
Viral marketing is very powerful. If you start a two-tier program, your affiliates earn money on the affiliates that sign up under them. This will encourage them to recruit more affiliates, which creates a viral marketing strategy with unlimited growth potential.
Better search engine placement
Link popularity plays a big role in the ranking of your site on the major search engines. The more sites that direct traffic to your domain, the higher your position will be. There is no better way to get hundreds of sites to link to yours than by starting your own affiliate program.
Tips on how to start a successful affiliate marketing campaign:
Don’t get fancy
In successful affiliate marketing, initial groundwork needs to be done to achieve long term success. Learning how it all works is your first step. Start by doing as much research as you can on affiliate marketing and then start out slow and simple until you find what works and you have a better understanding of the process.
Promote what you believe in
It’s a waste of time, energy and money to promote products or services that you don’t believe are great and, worse, aren’t relevant to your business or audience. It is a better idea to choose products that you have experience with yourself, or ones you can see benefiting you or your business.
Understand the rewards
Even though the level of commission that is offered will usually be what attracts you to a particular program, it’s just as important to keep in mind the likelihood of that product converting on your page.
For example, if product A offers $100 but has a 1% chance of converting and product B offers $25 but has a 5% chance of converting, then product B is the logical choice.
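The arithmetic behind that comparison is a simple expected-value calculation. A minimal sketch, using the hypothetical commissions and conversion rates from the example above:

```python
# Expected earnings per click = commission x probability of conversion.
def expected_value_per_click(commission: float, conversion_rate: float) -> float:
    return commission * conversion_rate

product_a = expected_value_per_click(100.00, 0.01)  # $100 payout, 1% conversion
product_b = expected_value_per_click(25.00, 0.05)   # $25 payout, 5% conversion

print(f"Product A earns ${product_a:.2f} per click")  # $1.00
print(f"Product B earns ${product_b:.2f} per click")  # $1.25
```

Despite the larger headline commission, product B earns 25 percent more per visitor you send.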
Content is king
The more visitors you have to your site, the more chances you have of making an affiliate referral to the merchant’s product or service. If your site carries a lot of constantly updated, interesting content, this will have a positive effect on the amount of traffic flowing to your site.
Reviewing the performance of the ads you’re hosting on your site is an important part of a successful campaign. Review the performance of your affiliate programs every month, and look at which ads are performing well and which aren’t. Then make adjustments, because it’s possible the ads could be better placed, depending on the audience that tends to visit certain pages on your site.
Affiliate marketing is one of the best online earning methods and with hard work, you can start to see a return with very little investment.
International SEO involves getting your site ranked for internationally competitive keywords, which drives traffic from around the world to your website. The more traffic from different parts of the world that your website receives, the more brand awareness, on a global level, your company will receive.
International search engine optimization doesn’t confine your business to its geographic location – it opens up all opportunities possible for growth, expansion and brand awareness. Launching an aggressive international SEO campaign involves a variety of different techniques.
SEO Fundamentals Remain the Same
The good news about launching an international SEO campaign is that the tactics you use for a domestic campaign are still the same. There are, however, additional factors to the decisions you make and the approach you take with a long-term strategy. Here are some of the most important considerations to address when launching your global SEO campaign:
Keywords are convenience words rather than natural language: they are the terms people actually type to search, and the terms search marketers respond to. Search phrases in English can have a completely different meaning in another language, and vice versa.
The solution is to research keywords natively in the target language, just as you would in English, rather than translating them word for word. To do this, use a native speaker of the target language who is also trained in search marketing.
Consideration of How You’ll Manage Content
Not every culture is the same, and this makes the transition from one language to the next a challenge if not handled properly. The best way to handle it is to build your English content with localization or translation in mind. The copywriter should create content that sticks to the facts and doesn’t include popular jokes or cultural references that a translator will be unable to carry over.
Fresh copywriting in each new language will be significantly more expensive than translation. A good option is to mix the two: fresh copywriting for a particular local subject that warrants it, and localization for the rest. The preferred choice would be to work with an international search marketing company that can localize and optimize at the same time.
Optimization of International Web Structure
The Web structure of each international version is a key element that should be analyzed independently. You need to make sure not only that the structure is relevant, featuring URLs with descriptive names in the right language, but also that it’s not overly complex and deep; for example, don’t add unnecessary directories. It is important to use a unique subdirectory, subdomain or specific ccTLD (country code top-level domain) to identify each international version.
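As an illustration (example.com and the paths are placeholders), the three common ways to separate a German-language version look like this:

```
ccTLD:         http://example.de/herren/schuhe/
Subdomain:     http://de.example.com/herren/schuhe/
Subdirectory:  http://example.com/de/herren/schuhe/
```

The ccTLD sends the strongest country signal, while subdirectories are the cheapest to maintain; whichever you choose, keep it consistent across all international versions.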
Crawling, Indexing and Alignment Issues
You need to make sure that your site is crawled effectively, indexed and correctly ranked in the correct international search result version and for the relevant language queries.
Check that you aren’t blocking any of your language or country versions, whether with robots.txt, by showing the content through script that can’t be indexed, or by always automatically redirecting search crawlers to only one version.
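For example, a robots.txt entry like the following (the paths are hypothetical) would silently hide an entire language version from crawlers:

```
User-agent: *
Disallow: /admin/   # blocking a private section like this is fine
Disallow: /de/      # mistake: this excludes the whole German version from indexing
```

A quick audit of robots.txt against your list of country and language folders catches this class of problem early.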
Correct Usage of Hreflang Annotations
The lack of use, or misuse, of hreflang annotations is one of the potential causes of misaligned international search results or traffic. Common mistakes include using non-existent language or country codes and failing to cross-reference the other pages featuring the same content for other countries or languages. It is crucial to make sure the annotations are used correctly.
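A minimal sketch of correct hreflang annotations (the URLs are placeholders). Each `link` element goes in the page’s `<head>`, and every page listed must carry the same complete set, including a reference to itself:

```html
<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="de" href="http://example.com/de/" />
<link rel="alternate" hreflang="es-mx" href="http://example.com/es-mx/" />
<!-- x-default: the page to show when no listed language/country matches -->
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

The language part uses ISO 639-1 codes and the optional country part uses ISO 3166-1 codes; an invented code like “uk” for the United Kingdom (which is actually “gb”) is exactly the kind of non-existent code the paragraph above warns about.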
Use the correct webmaster tools geolocation settings
A setting that is commonly either used incorrectly or not at all is the Geographic Target in Google Webmaster Tools. Verify it if you are using subdirectories or subdomains, rather than ccTLDs, for your country-targeted sites.
Linking International Websites
You need to be careful when cross-linking different international websites, even though you want to make sure that your users and search bots can correctly access all the different international Web versions. If there are too many cross-links, this might be seen as unnatural, especially if you’re working with gTLDs (generic top-level domains) rather than ccTLDs.
Additionally, if at some point you’re not managing the SEO processes for the other international versions and some of them get penalized, the cross-linking might end up hurting the site you’re focusing on too. This is why it’s important to monitor the cross-linking to minimize that risk as much as possible, or even just to make sure that search bots discover your different international Web versions without overlooking the crawling of your internal pages.
Even if you aren’t focused on an international SEO process, with these tips you can make sure that the internationally targeted Web versions aren’t going to negatively affect your current SEO project.
There are several SEO tips and tricks which can help you optimize your site, but one of the most underestimated of these is sitemaps. Sitemaps are just like the name implies, a map of your site. On one page you show the structure of your site, its sections, the links between them, etc.
Sitemaps make navigating your site easier, and having an updated sitemap makes indexing easier for users and more effective for search engines. Sitemaps are an important means of communication with search engines. While robots.txt tells the search engines which parts of your site to exclude from indexing, a sitemap tells the search engines where you want them to go.
Sitemaps aren’t a new concept; they have always been part of Web design best practices. However, with their adoption by search engines, they have become even more important. Note that if you plan to use sitemaps as part of an SEO strategy, the conventional HTML format alone won’t be picked up by Google, which uses an XML format.
There are two popular versions of a sitemap. An XML Sitemap is a structured format that is more targeted for the search engines because it tells them about the pages on the site, their relative importance to each other and how often they are updated.
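A minimal XML sitemap sketch following the sitemaps.org 0.9 protocol (the domain and values are placeholders). The optional `<priority>` and `<changefreq>` elements express the relative importance and update frequency described above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
    <changefreq>daily</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; the other elements are hints that search engines may weigh as they see fit.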
HTML Sitemaps are designed for the site’s users, to help them find content on the site, and they don’t need to include each and every subpage. They help both visitors and search engine bots find pages on the site.
Google introduced Google Sitemaps so web developers can publish lists of links from across their sites. The basic premise is that some sites have a large number of dynamic pages that are only available through the use of forms and user entries. The Sitemap files contain URLs to these pages so that web crawlers can find them. Bing, Google, Yahoo and Ask now jointly support the Sitemaps protocol.
Since all four of the major search engines use the same protocol, having a Sitemap makes sure they receive the most updated page information. Sitemaps aren’t a guarantee that all your links will be crawled, and being crawled doesn’t guarantee indexing. However, a Sitemap is still the best insurance for getting a search engine to learn about your entire site. Google Webmaster Tools allows a website owner to upload a sitemap that Google will crawl, or he can accomplish the same thing with the robots.txt file.
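The robots.txt route mentioned above is a one-line addition. A hedged sketch (the domain is a placeholder); the Sitemap directive points any crawler that reads robots.txt to the XML sitemap:

```
# robots.txt — the Sitemap directive is supported by all four
# major search engines and can appear anywhere in the file:
User-agent: *
Sitemap: http://www.example.com/sitemap.xml
```

This is useful because it advertises the sitemap to every crawler, not just the ones whose webmaster tools you have registered with.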
XML Sitemaps have replaced the older method of “submitting to search engines” by filling out a form on the search engine’s submission page. Now web developers submit a Sitemap directly, or wait for search engines to find it.
XML is also much more precise than HTML coding. Errors are not tolerated, so syntax must be exact.
For new sites, or sites with a lot of new or recently updated pages, using a Sitemap can be crucial. Even though you can operate your site without a Sitemap, they are becoming the standard in both navigation and more importantly SEO. Having a well organized Sitemap is becoming one of the tools to get your site seen.
At the end of last year, Google announced Shopping Campaigns, a new campaign structure for product listing ads that offers a better way to manage the unique challenges of scaling Google Product Listing Ads (PLAs). Shopping campaigns, which have several retail-specific features, represent a new phase in the development of Google’s feed-driven shopping ads.
The new structure will present cleaner, more intuitive methods of building out campaigns across catalogs of all sizes for many advertisers. For experienced advertisers with sophisticated campaign structures already in place, some features may be limiting. If the advertiser builds the campaign correctly, Google Shopping Campaigns will afford more discoverable views of performance by individual products, brands and categories.
Custom Labels and Product Groups
Currently, PLA campaigns organize inventory with “product targets,” which specify which products in the feed should trigger PLAs to appear for related searches. Shopping Campaigns will use product groups instead of product targets, giving advertisers more precision and control over their targeting.
Using groups gives advertisers the ability to segment their products using any of the attributes in the product feed including ID, product type, brand, product category and condition, as well as up to five custom labels. Custom labels allow advertisers to create even more precise groupings of their products, such as breaking out top performing or on-sale products. Custom labels and additional product details will be available in AdWords to tag and organize specific product traits.
Advertisers will then be able to use these groups to subdivide their products up to five times within Shopping campaigns, creating highly-specific product groupings. All products that aren’t placed in these groupings will be organized into an “everything else” grouping.
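As a hedged sketch, the custom labels live in the product feed itself as the attributes custom_label_0 through custom_label_4 (the IDs, titles and label values below are hypothetical). A tab-separated feed fragment might look like:

```
id	title	brand	product_type	custom_label_0
101	Trail Running Shoe	Acme	Shoes > Running	on-sale
102	Road Running Shoe	Acme	Shoes > Running	top-seller
103	Racing Flat	Acme	Shoes > Running	on-sale
```

With "on-sale" assigned in custom_label_0, a Shopping campaign could subdivide on that label to bid more aggressively on sale items, with everything else falling into the “everything else” grouping.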
Shopping campaigns will also let advertisers prioritize specific products or segments within campaigns without making bidding adjustments or reworking negative keyword strategies. This will be very helpful during promotional periods.
While PLA reporting capabilities were somewhat limited in terms of precision, Shopping campaigns will allow performance visibility by individual item ID/SKU, as well as by product attributes or custom labels. Even if the advertiser has a catch-all “all products” campaign, Shopping Campaigns will report on product-level metrics.
Shopping campaigns offer more robust competitive insights than PLA campaigns, with Impression Share and CTR/CPC performance benchmarking data at the product group level.
Impression Share data will enable improved budget and bidding strategies by showing advertisers how often they appear in auctions for terms related to their products. Google is also releasing a bid simulator feature that predicts bidding and impression volume for bid changes against competitors in the same auctions.
Google is in the process of working with agencies and search management platforms to add support for Shopping campaigns. While timing has not been confirmed, integrations are likely to occur in the beginning of summer. Since these tools aren’t yet available, advanced campaigns would need to be constructed manually, or through the AdWords API.
With API information just released in March, it will take time for bidding platforms to develop their offerings to support integration with shopping campaigns.
Shopping campaigns allow retailers to add a promotional message to all of the products within a particular ad group.
Shopping campaigns are a noticeable improvement over traditional PLA campaigns because of easier, more intuitive management of product groupings and exciting new reporting capabilities.
With the simplicity of managing Shopping campaigns, it is expected that more advertisers will take advantage of PLAs. A retail-centric experience for campaign management may be more attractive to small and medium business advertisers who may not have robust product feed and campaign management capabilities. As new potential advertisers enter the market, there is a possibility of CPC increases due to new competitor volume.
Despite the threat of rising CPCs, advertisers will have new layers of data in their toolbox for granular SKU level optimizations and on-demand competitive knowledge for more informed bidding and budgeting decisions.
Google is also anticipating building more tools and features into the Shopping Campaigns to further improve PLA management.
The Industry Buzz section is divided into three major sections, each of which is then subdivided into smaller sections.
Corporate Blogs include official blogs from web hosts, registrars, search engines and other related sites.
Magazines & Blogs include interesting websites related to the hosting industry, but not necessarily from official company blogs.
Industry Leaders include personal blogs from important industry figures, such as employees of Google and WordPress. These blogs sometimes offer insight into how industry leaders think, but may also cover topics not related to hosting.