Want more people to watch, share, and comment on your live videos? Looking for tips on improving the quality of viewer engagement? To explore how to get more engagement with Facebook Live video, I interview Stephanie Liu. More About This Show The Social Media Marketing podcast is designed to help busy marketers, business owners, and [...]
The post How to Get More Engagement With Facebook Live appeared first on Social Media Examiner.
HOUSTON, TX – cPanel & WHM® Version 74 is the third release of 2018 and includes improvements in many of cPanel & WHM’s most popular features.
Git Version Control allows users to easily create and manage Git repositories on their cPanel hosting account. Version 74 now includes the ability to automatically deploy sites and applications from a repository, making it even easier to host applications on cPanel servers.
The development team has been hard at work over the last year to improve the speed and quality of cPanel’s Automated Backups. cPanel users benefit from up to 60 percent faster backups and restores. The File and Directory Restore interface for server administrators and hosting users, along with new and improved remote destinations, makes cPanel’s backup system one of the most robust on the market.
AutoSSL and SSL certificates ordered through the cPanel Marketplace will see a much higher success rate with this version. cPanel has improved its Domain Control Validation (DCV): DNS-based validation and ancestor-based (or primary-domain) validation reduce the work required of a user to obtain an SSL certificate.
“cPanel & WHM Version 74 shows what can happen when you stay focused on what your customers need,” said Ken Power, Vice President of Product Development. “Across-the-board performance improvement, features focused on the evolving hosting market and tools that make server management a breeze. These changes and more make cPanel & WHM Version 74 our best release to date.”
As server administrators move to adopt cPanel & WHM Version 74, cPanel expects to see nearly 80 percent adoption across all cPanel & WHM servers. Take a look at the feature highlights on the cPanel & WHM Version 74 Release Site and read the full Version 74 release notes in the cPanel & WHM Documentation.
Since 1997, cPanel has been delivering the web hosting industry’s most reliable and intuitive web hosting automation software. This robust automation software helps businesses thrive and allows them to focus on more important things. Customers and partners receive first-class support and a rich feature set, making cPanel & WHM the Hosting Platform of Choice. For more information about cPanel, visit https://cpanel.com.
Seattle, WA – Colocation Northwest, a division of IsoFusion and a leading provider of Internet access and IT services in the Puget Sound region, announced today that it has completed upgrades to its Bellevue, Washington data center to allow for one megawatt of service capacity. The data center, located in Bellevue’s Eastgate neighborhood, completed a four-fold capacity increase from 250 kilowatts to one megawatt. The expansion included a state-of-the-art data center wing, immediately occupied by a global top-5 video game publisher and distributor.
The data center features state-of-the-art power, cooling, fire suppression, and security. To comply with the City of Bellevue’s sound impact concerns, Colocation Northwest installed Liebert DSE zero-water cooling units, MC Outdoor Condensers, and Econophase pumped refrigerant economizers. Not only is this system whisper-quiet, it also significantly improves data center efficiency.
“Our Bellevue colocation facility is open to any business requiring safe and reliable lease space for its IT equipment”, said Stephen Milton, Colocation Northwest CEO. “The data center expansion offers Eastside businesses a previously unavailable combination of computing power and proximity.”
The Bellevue data center supplies Tier 1 connectivity, the highest level of data and internet communications, via Colocation Northwest’s redundant fiber network. Colocation Northwest operates a proprietary 180Gbps dark fiber loop that circles Lake Washington and interconnects all of the company’s data centers, with additional network fiber routes to Colocation Northwest’s California locations and transpacific network access points to Asia and Pacific ring networks. Customers can use this fiber loop to access more than 250 carriers and service providers, including high-speed international access.
The Colocation Northwest Bellevue data center offers powerful computing for Eastside businesses in their own community with the option for systems management and connectivity to Colocation Northwest’s west coast redundant network of data center locations. Coupled with its expansion last year into the Centeris South Hill data center campus, Colocation Northwest continues to strengthen its ability to service enterprise and high-density customers with the most reliable, scalable, fully managed colocation solutions and facilities on the west coast.
About Colocation Northwest and IsoFusion:
Colocation Northwest, a division of IsoFusion Inc., is one of the largest privately held ISP and colocation providers in Western Washington. Founded in 1991 as ISOMEDIA, IsoFusion is a full-solution provider of Internet-related products and services that cater to businesses, as well as providing complex solutions to companies with a national presence. A competitive local exchange carrier (CLEC) in the state of Washington, IsoFusion offers a full range of services, providing everything from commercial fiber and Ethernet connections, custom Fiber to the Home (FTTH) community solutions, cloud, hosting, and dedicated server options, managed data center colocation services, and technology consulting for businesses. IsoFusion provides managed enterprise solutions and outstanding service to prominent Northwest businesses. IsoFusion is headquartered in Seattle, Washington and provides exceptional quality, value, and service to over 23,000 residential and business customers across the West Coast. For more information visit http://www.colocationnorthwest.com or http://www.isofusion.com.
South Bend, IN – HostColor.com (HC), a web hosting provider since 2000, has recently launched a new service named Managed Website, which includes technical and system administration. The difference between regular self-managed web hosting and the Managed Website is that the former is a do-it-yourself service. The owners of self-managed web hosting accounts usually receive login credentials for their control panel and account management system. The control panel is used for configuration and automation of various services such as http/https, email, antispam, antivirus, databases, FTP, storage space, and others, delivered from a shared server environment.
What is Managed Website?
Managed Website is an IT hosting service that uses the same technology environment as self-managed web hosting. There is one major difference, however: the Managed Website comes with technical and system administration. This means that website owners no longer need to learn how control panels and all the other website services work in order to configure them to their custom settings.
In other words, the Managed Website makes it possible for individuals and small business owners to save a lot of time by outsourcing all technical workloads to Host Color. All HC Managed Website Hosting plans feature 1 hour of technical and system administration by default. Website owners can extend the managed service on demand.
“The average site owner who manages a website based on the most popular LAMP (Linux, Apache, MySQL, PHP) server environment usually spends between 8 and 16 hours per month on technical administration. Those are numbers we received from a survey we conducted among our clients who use the self-managed web hosting services,” says Dani Stoychev, an Operations Manager at HostColor.com. He added that many people who spend more than 8 hours per month working with the cPanel control panel and on the backend of their websites usually enjoy doing so, and that is the main reason they handle the technical administration themselves. “Still, there are many small business owners who simply do not track the time spent on managing their websites and hosting accounts. At the end of the month, many of them find that they have lost 10 hours on average dealing with technical issues, instead of spending more time on developing and growing their business,” adds the HC Operations Manager.
Those who create and manage websites need broad knowledge of the Linux OS, web servers, databases, various open source frameworks, hosting automation control panels, CMSs, and various platforms and applications. Even web designers and digital agencies, who usually specialize in building websites on a certain CMS, aren’t typically familiar with server-side workloads, website maintenance, and troubleshooting.
“There are three major reasons for website management work to be shifted to the infrastructure providers,” says Dimitar Avramov, the CEO of HC. “Web hosting companies are open 24/7; their support teams have the knowledge to manage live versions of the websites; and managed web hosting providers charge less for maintenance than the digital agencies.”
HC has listed five different Managed Website plans on its website. Each of them features a different amount of computing resources. However, all of them come with “1 hour of Managed Service” per month. The company’s customers can add as many hours of technical and system administration as they need by paying a reasonably low per-hour fee. There are also monthly managed service packages available at special pricing for commitments of 4 hours or more.
The Managed Website includes a Domain Validated SSL certificate, a dedicated IP address, a certain number of hosted websites, enterprise-grade SSD storage with built-in data protection, uptime monitoring, and, most importantly, an on-site SEO service (website and content analysis). HC customers have the flexibility to subscribe to the Managed Website hosting service on a monthly basis.
About Host Color
Host Color (AS46873) has been a web hosting provider since 2000. It operates a fully redundant, 100% uptime network and peers with more than 70 quality Internet networks and ISPs. The company is also a cloud computing service provider and delivers public, private, and hybrid cloud servers from various data centers across North America. Its main data center is based in South Bend, Indiana, 90 miles from Chicago. HC also provides disaster recovery, colocation, and dedicated hosting in Europe through Host Color Europe.
CHICAGO, IL – ServerCentral, a leading provider of managed IT infrastructure solutions, today announced its VMware-powered Enterprise Cloud has received compliance certification with the Payment Card Industry Data Security Standard (PCI DSS).
“When there’s a chance to improve security, we want everyone to have it — no extra fees, no migration. If you work with ServerCentral, you get the best protection we can offer as soon as we can offer it,” said Tom Kiblin, Vice President of Managed Services at ServerCentral. “PCI DSS certification — the industry gold-standard for securely processing credit cards — is no different.”
ServerCentral’s annual AT-101 SOC 2 Type II audit serves as a foundation for reaching the latest PCI designation (version 3.2.1) and for helping customers meet their own compliance requirements:
“We really didn’t have to change a lot, which is a testament to how seriously we take security,” said Joe Johnson, IT Director and Compliance Officer at ServerCentral. “We were already following the proper protocols; we simply needed to organize our existing processes in the manner required for the PCI Security Standards Council.”
ServerCentral services that are also in scope for PCI DSS compliance include Colocation and Disaster Recovery. ServerCentral is constantly working to expand PCI DSS coverage across its Private Cloud, Hybrid Cloud, and Dedicated Server services.
If your application requires PCI DSS certification or you are looking for help with your PCI DSS compliance requirements, visit https://www.servercentral.com/compliance to see how ServerCentral can help.
ServerCentral’s Enterprise Cloud is architected for applications with premium performance requirements. To learn more, visit https://www.servercentral.com/enterprise-cloud/ and https://www.servercentral.com/compliance/.
ServerCentral is a managed IT infrastructure solutions provider. Since 2000, leading technology, finance, healthcare, and e-commerce firms have chosen ServerCentral to design and manage their mission-critical IT infrastructure in data centers across North America, Europe, and Asia. Whether it’s colocation, managed services, or a cloud, ServerCentral designs the optimal solution for each client.
About the PCI Security Standards Council
The PCI Security Standards Council (PCI SSC) leads a global, cross-industry effort to increase payment security by providing industry-driven, flexible and effective data security standards and programs that help businesses detect, mitigate and prevent cyberattacks and breaches. Connect with the PCI SSC on LinkedIn.
Wondering how to optimize your marketing images for different social media platforms? Looking for a guide to help you make sure your image dimensions are correct? In this article, you’ll discover a guide to the optimal image sizes for nine of the top social media networks. #1: Facebook Image Sizes Facebook is a starting place [...]
The post Social Media Image Sizes for 2018: A Guide for Marketers appeared first on Social Media Examiner.
Have you ever put off launching a project because you don’t have all your I’s dotted and all your T’s crossed? Then watch the Journey, Social Media Examiner’s episodic video documentary that shows you what really happens inside a growing business. Watch the Journey In episode 3, Michael Stelzner challenges his marketing team to launch [...]
The post Plan or Act? The Journey, Season 2, Episode 3 appeared first on Social Media Examiner.
Google Ads, formerly known as AdWords, offers a couple of different bidding strategies you can pick from. They are essentially the difference between manual bidding and automatic bidding, though they have a little bit of nuance you won’t grasp if that’s all you consider. Let’s talk about them, and the situations wherein each is better.
Before we go too deep, though, I need to put an answer right up front. I know the title of this post indicates that one might be better than the other, but you should know by now that this is never going to be the case. If one strategy performed much worse than the other in every situation, Google would have long since killed it off. The fact that both still exist should tell you that they’re both viable, in different ways. Indeed, that’s my goal with this post: to tell you how each one can be viable, and help you pick the one you should use.
The core of each option is pretty simple, so I can give you simple definitions of each.
Manual CPC Bidding is, as the name implies, manual bidding. You select an ad campaign, an ad group, or an individual keyword for an ad, and you set the bid you’re willing to pay for that specific entity. It’s a sort of fallback option for when you’ve hit upon a strategy that the automatic bidding doesn’t replicate. It’s also an option for when you want specific control over everything, with no uncertainty. If you’re operating at the very limits of your budget and don’t want any undue surprises, manual bidding can help prevent any such surprises from cropping up.
Target CPA Bidding is one of several “Smart Bidding” strategies offered by Google Ads. They also have target return on ad spend bidding, bidding to maximize conversions, and enhanced cost per click bidding. I’m mostly ignoring those other options for now.
Target CPA bidding is an automatic smart bidding strategy that uses Google’s immense array of performance data to optimize bidding to reach a specific CPA. This can be useful if you have a lot of conversions available at a high price point, but you want to target a lower price point and figure out what makes that audience different.
When Each Bidding Strategy Works Best
Once you know what each bidding strategy is, you can determine when to use one or the other. In general, both can be fairly useful, even on the same ads as you adapt to data that comes in.
As far as Manual CPC Bidding is concerned, it offers a much greater degree of control over your individual ads, ad groups, and campaigns. If you’re the kind of marketer who likes to make micromanagement changes to optimize every penny you put into your ads, manual bidding is the way to go.
Manual bidding also works best when you’re trying to get as many clicks or simple impressions as possible. You can set the right level of cap to prevent the expensive, less valuable clicks, while spreading that budget to as many clicks as possible. Of course, if you’re an experienced marketer, you know that the sheer volume of clicks is not necessarily the best optimization strategy. It is, however, a good way to harvest a lot of data that you can then use for future optimizations, both in positive trends and in negative aspects to avoid going forward. For example, if you find that an exceptional number of clicks – with a very low conversion rate – are coming from a specific demographic, you can cut that demographic out of future ads.
Manual CPC bidding is also a good option when you want to get impressions for a specific search query, but don’t need to spend to get the top position for that search. Figure out the level of bid necessary to get a visible placement and go from there.
Manual CPC is excellent for when you have a smaller, more limited budget. You can set lower caps than might be recommended by the automatic bidding engine, which lets you spread your money around more. This isn’t always going to get you the best possible return on your investment, as anyone who has bought cheap traffic from Fiverr can attest, but again, it can get you data that you can use for future refinement of all aspects of your ads.
Manual CPC bidding can work in reverse as well. If you have a high budget and want to maintain position in the top ranks, you can set a bid level adequate to reach that level. Automatic bidding has a tendency to cap your bid a little lower than you might otherwise like, in an attempt to save you money and spread that money out more without over-spending on those top spots. However, if you’ve done experiments and have determined that the extra expense is worthwhile in terms of conversions, setting the higher manual bid can be important.
On the other hand, for automatic Target CPA Bidding, you can flip many of those points on their head. It’s not quite the opposite, but more of a complementary situation.
The way target CPA bidding works requires an initial investment. It’s an automatic bidding strategy, so if you use it for a brand new ad with no historical data, Google will be forced to estimate based on similar ads from the past. Even very similar ads may have performed in very different ways, however, so this data won’t be accurate. Google requires your ads to reach a certain level of conversions, around 15, before the data is refined enough to make smart decisions.
Therefore, automatic target CPA bidding is less effective when you’re running brand new ads, but more effective on ads that have some data behind them.
Automatic bidding is useful for when you have a specific target CPA you want to reach as well. For example, if your CEO decides that he doesn’t want to pay for conversions if they’re over a certain cost, you can set that as your cap for automatic CPA bidding, and it will prevent you from over-spending on your conversions.
The other situation where target CPA bidding is better is when you have an immense number of ads you’re managing. Large businesses with old ad accounts may have racked up thousands, tens of thousands or even hundreds of thousands of different keywords and targeting factors for a vast array of ads.
Even the best, most effective marketer is going to have a hard time micromanaging the manual CPC bid for every one of those keywords. From a time-investment standpoint, it’s far more effective to set automatic CPA bidding for such a large expanse of advertising.
This way, you can have most of your ads managed automatically, and handle the outliers with manual bidding. In fact, my recommendation is to start every ad with manual bidding, then once it has reached a certain conversion level, make a decision. If it’s a moderately effective ad, switch it to automatic bidding. If it’s not performing up to par, cull it and start over. There’s no reason to set automatic bidding to optimize an ad that isn’t performing before you begin, right?
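That manual-first workflow can be sketched as a tiny Python helper. Everything below is a hypothetical illustration: the 15-conversion baseline comes from the discussion above, but the performance check, thresholds, and function name are my own assumptions, not anything Google Ads provides.

```python
# Hypothetical sketch of the manual-to-automatic switching rule described
# above. The thresholds are illustrative assumptions, not Google guidance.

MIN_CONVERSIONS = 15  # rough baseline before smart bidding has enough data

def next_bidding_action(conversions: int, cpa: float, target_cpa: float) -> str:
    """Decide what to do with an ad that started on manual CPC bidding."""
    if conversions < MIN_CONVERSIONS:
        return "keep manual"           # not enough data yet to judge
    if cpa <= target_cpa:
        return "switch to target CPA"  # performing well; let smart bidding optimize
    return "cull and restart"          # underperforming; no point automating it

print(next_bidding_action(conversions=8, cpa=12.0, target_cpa=10.0))   # keep manual
print(next_bidding_action(conversions=20, cpa=9.0, target_cpa=10.0))   # switch to target CPA
print(next_bidding_action(conversions=20, cpa=14.0, target_cpa=10.0))  # cull and restart
```

The point of writing the rule down is that it forces you to pick your thresholds explicitly instead of switching ads over on gut feel.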
Taking Google’s Recommendations
Remember that Google has access to all of the data recorded for all of the ads ever run through the Google Ads platform. That’s an incredible amount of data. Google uses it to inform the decisions of their automatic bidding, but they also use it to recommend bid caps and bid levels for ads you’re starting to create. Should you trust those recommendations?
I say yes, to an extent. Google knows a rough level where your ads will perform, but the numbers aren’t always going to be ideal for you. You might want to set a slightly higher cap, up to 15-20% higher than Google’s recommendation. A higher cap will allow you to obtain more initial data, which you can use to refine your ads moving forward.
On the other hand, you may want to set a cap of up to 15-20% lower than the Google Ads recommendation. Why? If you have a limited budget, or if you want to analyze specifically the cheaper end of your ad spectrum, setting a lower cap will give you the data you need without over-spending. It’s very easy to accidentally set too high a bid cap and blow through a budget in a few days instead of a few weeks.
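To make the arithmetic concrete, here is a small Python sketch of those cap adjustments. The $2.50 recommended bid is an invented figure for illustration, not a real Google recommendation.

```python
# Worked example of shifting a recommended bid cap by the 15-20% margins
# discussed above. The recommended value is made up for illustration.

def adjusted_cap(recommended: float, pct: float) -> float:
    """Return the recommended cap shifted by pct (e.g. +0.20 or -0.15)."""
    return round(recommended * (1 + pct), 2)

recommended = 2.50  # hypothetical Google-recommended max CPC, in dollars

print(adjusted_cap(recommended, 0.20))   # 3.0  -- aggressive cap for more initial data
print(adjusted_cap(recommended, -0.15))  # 2.12 -- conservative cap for tight budgets
```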
Due to the way the auctions work, costs can vary quite a bit even within specific bid ranges. It’s best to set what you’re willing to spend, and allow the system to charge only what you’re able. Don’t over-spend on the assumption that you’re going to get usable data or effective customers out of it. Often, you just end up raising your average cost per customer acquisition.
Speaking of adequate data, how long should you run manual bidding before switching to automatic? Most recommendations I’ve found indicate that you want to let manual bidding run for about a month. For high-volume, high-bid keywords, you may consider a faster switch, so long as you have enough conversions. 15+ conversions will give you a good baseline, though if you’re operating on a larger scale, you want considerably more. If 15 conversions amounts to a few hours of ads for your large business, it’s not likely to be an adequate sample for informing future ads. Let them run a while longer before you switch over.
Other Sorts of Bidding
I mentioned in passing that there are other bidding strategies. Let’s briefly talk about each.
The smart bidding option includes more than just target CPA bidding. You can also target a specific return on ad spend, which is very useful if you have a variable value for a single conversion. You can optimize for those larger orders rather than the volume of smaller orders you might otherwise get.
You can also target maximizing conversions rather than optimizing costs. This is a great option when you want as many conversions as possible and have the budget to support it, but it’s a terrible option for marketers on limited budgets who want cheaper conversions.
There’s also an enhanced cost per click bidding, which works sort of like manual CPC bidding, but done by the Google algorithms instead of your own manipulation. This can be a good way to split the difference between the two options, but might not be as effective as your own manual bidding strategies.
You can use a bidding strategy to maximize clicks as well. As the name implies, this will get you as many eyes on your landing page as possible, but is more of a CPM strategy than a CPC strategy. It gets you as many clicks as possible for your budget, but doesn’t care if those clicks result in conversions. For obvious reasons, this can be less than effective.
Google Ads also has a series of options for impression-focused bidding. You can bid to target a specific position for ads, including just the top spots, or in-search rather than sidebar positioning. You can also choose a specific competitor’s domain you want to out-rank and bid to out-rank it. This can work well if the competitor is working on a limited budget, but it has the potential to run out of control if you both set each other as out-rank targets. This will escalate both bids until one reaches its cap, which can waste a lot of money.
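The runaway scenario is easy to see in a toy simulation. This is a deliberately simplified sketch of two advertisers who each try to outbid the other: real ad auctions are second-price and far more complex. All amounts are in cents to keep the arithmetic exact.

```python
# Toy simulation of two advertisers who each use an "outrank the other"
# strategy: every round, whoever is behind bids one step above the rival,
# until one side hits its bid cap. Amounts are in cents.

def escalate(bid_a: int, bid_b: int, cap_a: int, cap_b: int,
             step: int = 10) -> tuple[int, int]:
    """Escalate both bids until one advertiser can no longer outbid the other."""
    while True:
        if bid_a <= bid_b and bid_b + step <= cap_a:
            bid_a = bid_b + step       # A outbids B
        elif bid_b < bid_a and bid_a + step <= cap_b:
            bid_b = bid_a + step       # B outbids A
        else:
            return bid_a, bid_b        # someone hit their cap

# Both start at $1.00; A caps at $5.00, B caps at $3.00.
print(escalate(bid_a=100, bid_b=100, cap_a=500, cap_b=300))  # (310, 300)
```

Both advertisers end up paying near the lower cap, far above where either started, which is exactly the money-wasting dynamic described above.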
There are also the typical CPM options, including target CPM and viewable CPM (vCPM), both of which are only useful when you want eyes on your ad but don’t really care about clicks. If you have a very, very compelling ad, this can be a great option, but more often than not a CPA or CPC strategy will work a lot better.
The post Google Ads Target CPA vs Manual CPC – Which Is Better? appeared first on Growtraffic Blog.
Want to create social media images without purchasing expensive software? Looking for a solution you can access from any computer? In this article, you’ll learn how to create professional-looking images using Google Drawings. Why Use Google Drawings? Google Drawings was originally built to help businesses create flowcharts, website wireframes, mind maps, and other types of [...]
The post How to Create Social Media Images With Google Drawings appeared first on Social Media Examiner.
Do you host online events, webinars, or product launches? Wondering how to incorporate Facebook events into your marketing strategy? In this article, you’ll discover how to create and host a virtual Facebook event. Why Host a Virtual Facebook Event? A virtual Facebook event allows you to actively engage Facebook users without having (or in addition [...]
The post How to Add a Facebook Virtual Event to Your Launch Strategy appeared first on Social Media Examiner.
Looking for ways to optimize your Facebook ads to acquire more customers? Wondering how to scale campaigns that are working well? In this article, you’ll discover four ways to reduce your customer acquisition costs when scaling your campaigns. Set Up Your Ads Manager Dashboard to Assess Customer Acquisition Costs If you’re new to Facebook advertising, [...]
The post 4 Ways to Reduce Customer Acquisition Costs With Facebook Ads appeared first on Social Media Examiner.
Oh no, another pollster at the front door! Will his survey give a true picture of the participants’ actual opinions? That depends on many variables. It’s the same with the studies and tests we do for digital marketing and SEO.
In this episode of our popular Here’s Why digital marketing video series, Stone Temple’s Eric Enge explains the marketing value of doing studies and tests, and offers some tips on how to do them well.
Content, Shares, and Links: Insights from Analyzing 1 Million Articles
Mark: Eric, can you share with our viewers the reason why Stone Temple does so many research studies?
Eric: I’d be happy to do that, Mark. We actually get a ton of benefits from them.
First of all, they give us better knowledge about how things work in the industry, and we think that’s really invaluable.
They’ve also led to many speaking opportunities for both of us, and helped enhance our brand reputation, and visibility. Those are all really good things to do. Frankly, it’s led to new business for us, which is great.
The studies also helped us build major media relationships, which feeds more into our brand reputation and visibility.
Finally, I can’t change the fact that I’m an SEO guy, and we get more links as a result of the studies, which is really cool.
Mark: We won’t turn those away.
Eric: Yes, exactly.
Mark: But why should other businesses consider doing this type of research and publishing it?
Eric: Any business, I think, can get this benefit. It really helps to drive brand reputation and visibility. And, of course, there’s the links to your site. The thing is that research studies play a unique role in that regard.
A study was done by Moz and BuzzSumo that you and I have cited many times, where they looked at a million articles online to see what the correlation was between links and social shares. The idea is, if you get content that gets both of those, there’s a pretty good chance that it’s authoritative and it’s the real deal, rather than if it’s just one metric or the other is working for you.
They found literally zero correlation between social shares and links when looking at the million articles in aggregate.
But when they actually narrowed that down to some very specific areas and different groups of sites to find out which ones actually showed a strong correlation, they found two remarkable things. Major media sites that put out opinion-forming journalism get a very strong correlation between shares and links. That’s kind of cool to know, but most of us are not major media sites, so maybe that won’t work so well for us.
Although, it’s a good idea to work to build your reputation to a point over time where you actually do have the kind of reputation where people care about your opinion. But you’re not going to get that on day one, even if you’re an established brand; it’s not instantaneous.
Eric: The other thing that they saw in the study was that sites that had a lot of research content, like Pew Internet, did extremely well with the correlation between shares and links. And that actually lines up very much with what we’ve seen in our own efforts.
Sites with lots of data-based research studies tend to do better with both link earning and social shares.
The bottom line of why this is important: if you do really high-quality research and publish it, you can actually earn credibility, whereas with opinion-forming journalism, you have to build the credibility first. So, it can help you get that credibility you’re looking for.
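The shares-versus-links correlation check described above can be reproduced on your own content data with a few lines of Python. The sample numbers below are invented placeholders, not figures from the Moz/BuzzSumo study.

```python
# Minimal sketch of a shares-vs-links correlation check using Pearson's r.
# The sample data is invented for illustration.
import math

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

shares = [120, 340, 15, 980, 45]  # hypothetical social share counts per article
links  = [8,   25,  1,  60,  3]   # hypothetical referring links per article
print(round(pearson(shares, links), 3))  # close to 1.0 on this toy data
```

On a real dataset of thousands of articles, a value near zero would replicate the aggregate finding, while a value near 1 would match the pattern seen for research-heavy sites.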
How to Create Research Study Content
Mark: It seems to me what’s probably stopping a lot of people is how to go about picking a topic for a study or a test.
Eric: Really the core thing to do is to start with the right upfront research. You need your topic to be something that is going to stand out and be unique enough and different enough from what other people have done out there that it will attract attention.
Obviously, core to this as well is that it has to be of significant interest to the market. If the market doesn’t care, uniqueness won’t save you; there are lots of unique things you could publish that no one will care about.
Then, you have to be able to execute it. You can have the vision and think, “Wouldn’t it be wonderful if I had this data?” But you might not be able to actually execute it.
Finally, you have to be able to bring some credibility to it and, just so we don’t confuse this with the point I was making before, the way you go about executing it is a big part of how you bring credibility to it. You’ve got to be able to do that.
High-performing research study content 1) starts with a great topic of study, 2) is executed properly, and 3) comes from a credible source.
How to Conduct a Data Research Study
Mark: Let’s focus on that for a moment. What are some things that you think are critical to the execution stage of a study?
Eric: There are a few different things that you really need to think about.
The first is that you have to have a solid plan for collecting the data. If you’re doing a survey, one thing to think about is how you’re going to reach your audience. Or, if you’re looking at correlations between something like links and social shares, how are you going to collect that data?
You need to know that you can get the data in a clean and pure form. And then, you need to find the right sources to get that data in enough volume so it’s statistically significant. Don’t take one example of something and say, “Oh, I’ve seen this thing happen. When I did this, that happened. I saw it once; therefore, it happens that way all the time.” It has to be in depth and credible.
Then, when you analyze the data, you need to inspect it ruthlessly to make sure it’s clean and it makes sense. We do this all the time in our studies, and I do a lot of this work personally. As I really dig into the data I’m always asking questions. “No, that data doesn’t make sense. Check into it.”
Once you’ve done that, you have to ignore your biases. For example, when you entered this study, you probably had a specific conclusion in mind that you were looking for, but, as I like to say it, you have to let the data tell the story. If you don’t do that and you let your biases interfere, then you’re putting your credibility on the line.
You’ve got to tell the story in a fair and balanced way. Don’t leap off the deep end just because you get something that suggests something might be true. If you can’t do all of those things, people will know. I’ve seen some very reputable organizations and people publish studies where they didn’t follow through on all these principles, and they took a lot of heat for it, and they didn’t get the result they were looking for.
Mark: Great points. Anything else you want to add?
Eric: There’s one other major thing to think about. It’s all well and good to create this wonderful content and publish it, but it isn’t really going to do much for you unless you promote it effectively. You have to invest in your promotion plan as well, and that is something that you want to begin doing well before you publish the study.
Also, you need to know who the people are that you want to get interested in sharing on social media, and you need to know who might write about it. Do research on all of these. Ultimately, use social media to reach out to those influencers. Then reach out to the media people before it publishes.
When you do this kind of outreach, please, please, please don’t use a minimally customized email sent out as a blast to a bunch of people. Customize every single piece. I promise you, your yes rate will go up if you send fewer pieces and do a lot more customization. It’s incredibly important. And you have to start this outreach process at least a week or so before the study publishes.
Here’s a quick hack. Reach out to a limited number of media people and offer them the content under embargo. Media people like that. They’re getting privileged access in advance, which increases the chance that they’re going to write about your content.
Mark: Thanks, Eric.
Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.
Subscribe to Here’s Why
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Welcome to this week’s edition of the Social Media Marketing Talk Show, a news show for marketers who want to stay on the leading edge of social media. On this week’s Social Media Marketing Talk Show, we explore new Instagram features and Facebook ads updates. Our special guests include Peg Fitzpatrick and Amanda Bond. Watch [...]
The post Facebook Ads Updates and New Instagram Features appeared first on Social Media Examiner.
Want more visitors to your website? Wondering how Pinterest can help? To explore how to drive more traffic to your website with Pinterest, I interview Jennifer Priest. More About This Show The Social Media Marketing podcast is designed to help busy marketers, business owners, and creators discover what works with social media marketing. In this [...]
The post Pinterest Strategy: How to Get More Traffic From Pinterest appeared first on Social Media Examiner.
REDWOOD CITY, CA – Equinix, Inc. (Nasdaq: EQIX), the global interconnection and data center company, today announced that its Board of Directors has appointed Charles Meyers to the position of President and Chief Executive Officer, effective immediately. Meyers will also join Equinix’s Board of Directors. He succeeds Peter Van Camp, who has served as interim CEO since January 2018. Van Camp will resume his role as Executive Chairman of the Equinix Board of Directors.
“Charles is an outstanding leader who has been a major contributor to Equinix’s success over the past eight years, playing critical roles in the company as we have quadrupled in size, growing from $1.2B in revenue to the $5B plus we expect to generate this year,” said Peter Van Camp, Executive Chairman for Equinix. “Charles brings that rare combination of a world-class operator combined with a passion and drive for strategic innovation. These characteristics, and his proven track record of delivering value for our customers and our shareholders, make him an excellent choice to successfully implement our strategy and take advantage of the market opportunities ahead.”
Meyers has a distinguished 25-year career in the technology industry including a number of executive leadership positions at leading telecommunications and information technology companies. Meyers joined Equinix in 2010 as President, Americas, leading the company’s largest operating region through a time of significant growth and strong operating performance. In 2013, he was appointed Chief Operating Officer at Equinix, where he spearheaded the company’s drive for global consistency, leading the Global Sales, Marketing, Operations and Customer Success teams. For the past year, he has served as President of Strategy, Services and Innovation (SSI), where he oversaw the company’s product organization and led the technology, strategy and business development teams driving the company’s next phase of growth and focusing on the future needs of customers and partners.
“I joined Equinix eight years ago to be part of a company with exceptional opportunity and a phenomenal team of employees and I am incredibly honored and thrilled to now serve the company in the role of CEO,” said Meyers. “I look forward to partnering with my over 7,500 colleagues around the globe in service to our customers and partners as we expand our unmatched ability to help organizations drive their digital transformation agendas. The opportunity that lies ahead for Equinix is enormous and I am committed to driving our innovation, strategy and execution to become the trusted center of a cloud-first world.”
Equinix, Inc. (Nasdaq: EQIX) connects the world’s leading businesses to their customers, employees and partners inside the most-interconnected data centers. In 52 markets across five continents, Equinix is where companies come together to realize new opportunities and accelerate their business, IT and cloud strategies.
NEW YORK, NY – Webair, a high-touch, agile Cloud and fully managed infrastructure service provider, today announces the availability of its Backups-as-a-Service (BaaS) solution for customers’ Microsoft Office 365 data. This solution addresses customers’ growing requirements for a single-provider solution to fully manage their business continuity and disaster recovery requirements, which span a range of platforms including Software-as-a-Service (SaaS).
As a growing number of enterprises migrate toward hybrid IT strategies, many SaaS applications such as Microsoft Office 365 have become critical to companies’ core operations. Webair’s BaaS solution for Microsoft Office 365 enhances the automatic data replication Microsoft provides across its data centers and helps ensure availability to users across Microsoft Office 365, SharePoint and OneDrive.
Webair’s BaaS solution for Microsoft Office 365 integrates the company’s Off-site Backups orchestration platform with Veeam’s Data Mover component to provide comprehensive backup services that enhance the flexibility and resiliency of critical data with the ability to access and restore information at any time. Webair’s CloudPanel provides customers with full transparency into their environments as well as multiple options for data recovery, including restoring data to Office 365 or local / client-owned Exchange servers and exporting raw data directly or to .PST files.
The solution allows customization of backup policies and data retention periods per job, user or group, as well as the ability to mix-and-match between Office 365, SharePoint and OneDrive. This unique capability enables users to meet specific SLA requirements as well as compliance regulations such as PCI, HIPAA and HI-TECH, which require immutable data, long-term retention capabilities and the ability to restore to specific points in time for both current and former employees. Webair’s BaaS solution for Microsoft Office 365 is also an effective ransomware and malware mitigation tool, providing the ability to restore modified and deleted data in the event of a cyberattack.
“Customers’ production infrastructures are shifting to include a multitude of platforms that span public and private cloud, colocation, on-premise environments, and SaaS. They are looking for a single pane of glass and a central point of accountability for their resilience and assurance across these hybrid infrastructures,” explains Sagi Brody, Chief Technology Officer, Webair. “Webair’s BaaS solution for Microsoft Office 365 provides an added layer of resiliency and security for customers by replicating data to a multitude of potential targets, including those ‘air-gapped’ from the internet as well as those supporting high levels of encryption and compliance. At Webair, it’s important that we provide comprehensive solutions for our customers’ IT needs, and we are very excited to introduce this new offering to our growing portfolio of services.”
As a high-touch, long-term IT partner, Webair takes full ownership of customers’ backups, providing one point of accountability and security for the management, monitoring and recovery of their data. Webair facilities feature superior physical security on premises as well as encrypted connectivity to customer networks covered by Business Associate Agreements (BAAs) and Service Level Agreements (SLAs).
Headquartered in New York for over 20 years, Webair delivers agile and reliable Cloud and Managed Infrastructure solutions leveraging its highly secure and enterprise-grade network of data centers in New York, Los Angeles, Montréal, Amsterdam, and Asia-Pacific. Webair’s key services include fully managed Public, Private and Hybrid Cloud; Customized Networking; Disaster Recovery-as-a-Service; Ransomware Recovery-as-a-Service; Off-Site Backups; DDoS Mitigation; Web and application stacks; and Colocation. Webair services can be delivered securely via direct network tie-ins to customers’ existing infrastructure, enabling them to consume SLA-backed solutions with ease, efficiency and agility — as if they were on-premise. With an emphasis on reliability and high-touch customer service, Webair is a true technology partner to enterprises and SMBs including healthcare, IT, eCommerce, SaaS and VoIP providers. Because Webair focuses on its core value of owning managed infrastructure within its own facilities, it is also an ideal cloud solution provider and business partner for VARs, MSPs, and IT consultants. For more information, visit www.webair.com, or follow Webair on Twitter: @WebairInc, Facebook: facebook.com/WebairHosting and LinkedIn: www.linkedin.com/company/webair.
Veeam® is the global leader in Intelligent Data Management for the Hyper-Available Enterprise™. Veeam Hyper-Availability Platform™ is the most complete solution to help customers on the journey to automating data management and ensuring the Hyper-Availability of data. We have more than 307,000 customers worldwide, including 75% of the Fortune 500 and 58% of the Global 2000. Our customer satisfaction scores, at 3.5X the industry average, are the highest in the industry. Headquartered in Baar, Switzerland, Veeam has offices in more than 30 countries. To learn more, visit https://www.veeam.com.
ALBANY, N.Y. – Leading Data Center and Cloud Hosting Solutions provider TurnKey Internet, Inc. announced today that their flagship data center in Albany, New York has once again earned the U.S. Environmental Protection Agency’s (EPA) ENERGY STAR certification for superior energy performance. TurnKey Internet’s data center is the 2nd in the state of New York to achieve the highest environmental standards and earn the ENERGY STAR data center certification. TurnKey’s facility utilizes state-of-the-art technology and 100% renewable energy to provide the greenest cloud-based IT services for clients all over the world.
TurnKey Internet runs its ENERGY STAR certified data center with 100% renewable hydro and solar power. The facility is equipped with rooftop solar panels and further supplements its renewable energy with hydro power as a beneficiary of Gov. Cuomo’s ‘Recharge New York’ power program. The data center is SSAE 18 SOC 2 certified, confirming the highest tier of reliability. The facility features low-voltage servers stored in ultra-efficient cold containment pods that deliver precision spot-cooling to temperature-regulated server racks, an approach that uses thirty-three percent less energy than traditional data centers.
ENERGY STAR certified data centers and facilities are verified to perform in the top 25 percent of buildings nationwide, based on weather-normalized source energy use that takes into account occupancy, hours of operation, and other key metrics. ENERGY STAR is the only energy efficiency certification in the United States that is based on actual, verified energy performance.
“We’re honored to once again earn the ENERGY STAR for superior energy performance at our Albany, New York data center and appreciate the efforts of everyone who has been involved in its efficient operation,” said Adam Wills, CEO of TurnKey Internet. “TurnKey’s green data center was built with sustainability in mind, and our commitment only evolves as the world-wide demand for energy consumption continues to multiply.” He continued to say, “Saving energy is just one of the ways we show our community we care, and that we’re committed to doing our part to protect the environment and public health, both today and for future generations.”
In 2013, The New York State Department of Environmental Conservation awarded TurnKey Internet the Environmental Excellence Award for innovative facility design and outstanding commitment to environmental sustainability, social responsibility and economic viability. For more information about TurnKey Internet’s ENERGY STAR certified data center, or to speak with a Cloud Hosting Solutions expert, visit https://www.turnkeyinternet.net/
About TurnKey Internet
Founded in 1999, TurnKey Internet, Inc. is a full-service Cloud Hosting Solutions provider with Data Centers in New York and California specializing in Infrastructure as a Service (IaaS) to clients in more than 150 countries. Services offered on both the East Coast and West Coast of the USA include Public Cloud, Private Cloud, Dedicated & Bare Metal Servers, Backup & Disaster Recovery, Online Storage, Web Hosting, Managed Hosting, Hybrid Solutions and Enterprise Colocation. Headquartered in New York’s Tech Valley Region, TurnKey Internet’s flagship company-owned data center is SSAE 18 SOC 1 & SOC 2 certified, as well as HIPAA compliant with HITRUST CSF certification. The facility is powered exclusively by on-site Solar and Hydroelectric sources to provide a 100% renewable energy footprint and is the 39th ENERGY STAR® Certified Data Center in the United States.
ASHBURN, VA – Aligned Energy, a leading data center provider offering innovative, sustainable and adaptable colocation and build-to-scale solutions for cloud, enterprise, and managed service providers, today announced its new 26-acre, 180-Megawatt master-planned data center campus in Ashburn, Virginia.
When complete, the campus will offer approximately 880,000 square feet of expandable space, drawing redundant, critical power from two on-site substations to service the IT operations of hyperscale and cloud service provider customers.
The campus’ initial 370,000-square-foot, 80 MW facility, followed by a 510,000-square-foot, 100 MW development, will sit atop major fiber and conduit routes, providing access to more than 50 carriers in the immediate area. Both facilities will feature Aligned Energy’s on-demand adaptable and intelligent, dynamic infrastructure, complete with its patented, award-winning data center cooling technology, which is purpose-built to support high, mixed, and variable power densities of 1-50kW per cabinet in the same footprint.
“Our new Ashburn data center campus addresses the needs of cloud providers and hyperscalers that demand a highly dynamic, scalable and future-proof data center solution,” said Andrew Schaap, CEO of Aligned Energy. “Data centers are the new engines of innovation for the 21st century, and we are delighted to provide Northern Virginia with an incredibly efficient and highly reliable colocation data center platform.”
This new campus is a strategic addition to Aligned Energy’s portfolio. With high-capacity, adaptive, and future-proof facilities in Dallas, Phoenix, Salt Lake City, and now Ashburn, Aligned Energy is well-positioned to service customer needs in the country’s fastest-growing data center markets.
Aligned Energy’s mission is to make data center critical infrastructure intelligent enough to continuously improve both its economic performance and environmental impact, delivering a noticeable business advantage. The company’s unique approach to infrastructure deployment allows it to deliver the data center platform like a utility – accessible and scalable as needed. It also reduces the energy, water and space required to operate physical data center environments, significantly improving sustainability and yielding greater water usage effectiveness for customers.
From build-to-scale and customizable services, to rapid power and square footage scalability, to just-in-time provisioning and accelerated delivery schedules, Aligned Energy’s data center platform and business model are uniquely positioned to address the infrastructure needs of today’s hyperscalers and cloud service providers.
About Aligned Energy
Aligned Energy is an infrastructure technology company that offers adaptable colocation and build-to-scale solutions to cloud, enterprise, and managed service providers. Our intelligent infrastructure allows us to deliver data centers like a utility—accessible and consumable as needed. By reducing the energy, water and space needed to operate, our data center solutions, combined with our patented cooling technology, offer businesses a competitive advantage by improving reliability and their bottom line. For more information, visit www.alignedenergy.com and connect with us on Twitter, LinkedIn and Facebook.
Have you ever contemplated stopping something to ensure business growth? Then watch The Journey, Social Media Examiner’s episodic video documentary that shows you what really happens inside a growing business. Watch the Journey In episode 2, Michael Stelzner makes a decision to cut off a segment of his audience he’s been developing for years. He [...]
The post Cutting for Growth: The Journey, Season 2, Episode 2 appeared first on Social Media Examiner.
Wondering if it still pays to advertise on social media these days? Looking for data to show you where you should be investing your ad spend? In this article, you’ll discover insights that reveal how fellow marketers are planning their social advertising and which platforms offer new opportunities for ad placement. #1: Social Ads Are [...]
The post Social Media Advertising: New Research for Marketers appeared first on Social Media Examiner.