Industry Buzz

StudioPress Theme Spotlight: New Breakthrough Pro

WP Engine -

As we continue to showcase a sample of the amazing StudioPress themes now available to WP Engine customers, we’re extra excited to bring you this StudioPress Theme of the Week: Breakthrough Pro, which is the first new theme to be launched since our acquisition of StudioPress in June. Breakthrough Pro was built with today’s creative,… The post StudioPress Theme Spotlight: New Breakthrough Pro appeared first on WP Engine.

How to Sell Products Online in 6 Easy Steps

HostGator Blog -

The post How to Sell Products Online in 6 Easy Steps appeared first on HostGator Blog. Deciding to start an online business and begin selling products online can be an exciting experience. However, that excitement can quickly give way to overwhelm if you don’t follow the proper process. There is a lot of consideration and research to be done if you want to learn to sell products online the right way. Below you’ll learn the proper steps to take before launch, during launch, and afterward to set your online store up for long-term success.

1. Decide What to Sell

Choosing the right products to sell will make or break your success online, so you should spend a lot of time in the research phase. It can be helpful to choose a product or market that you actually care about. With more competition every single day, choosing a market you’re passionate about will give you a leg up, as you’ll be willing to go the extra mile. Ask yourself: What kind of products would I love to sell? What would be my dream niche to serve? What industries do I have experience and knowledge in? What pain points currently exist in the market? Do my products provide a practical solution? This should give you a list of products or markets that you’d love to serve. With this in mind, it’s time to get a better picture of the existing market so you can decide how to compete and position yourself.

2. Research Your Market

You probably already have an idea of some of the competitors in your space, but now it’s time to take a deeper dive. You’ll be looking at companies that sell similar products: what makes their approach unique, the methods they use to market themselves, and how they speak to your target market. Find your top competitors and make a list with the above elements in mind. This will not only help you better understand how to market and sell your products; you might also uncover an underserved portion of the market hungry for your products.
Beyond having a deep understanding of your market, you’ll also want to thoroughly understand your customers. This will make the sales and marketing process much easier. Ask yourself the following questions: How old is my customer? Where do they live? What’s their gender? How much money do they make? What’s their occupation? What other interests do they have? How do they spend their time? What are their beliefs about the world? Why do they buy products like yours?

3. Decide How to Ship Your Products

With an idea of what you’re going to sell, the existing market, and your buyer preferences, it’s time to think about how you’re going to ship your products to customers. There are two main approaches. The first is hiring a manufacturer to create your products for you. This can lead to a more custom product, higher quality control, and a lower cost per unit. However, you’ll have to spend more time creating your product, working out manufacturing issues, and figuring out shipping. The second approach is dropshipping. Here, you’ll be purchasing other people’s products and selling them through your online store. The drop shipper will also fulfill and ship orders on your behalf. This approach has lower overhead costs and less work overall. However, you may have to operate on slimmer margins, and you’ll have less quality control over the final product.

4. Build Your Online Store

Now it’s time to start building your online store. You have a few different approaches to choose from. You can build your own online store through WordPress and a tool like WooCommerce. You can sell products through an existing platform like Etsy or Amazon. Or you can use an eCommerce website builder to easily build your store and manage your products. For the sake of this tutorial, we’re going to assume you’re using a website builder. This approach will give you the freedom of customizing your own site while handling all of the technical details for you.
With an eCommerce website builder, all you have to do is select a theme, customize it to your liking with the drag and drop builder, upload your products, and press publish. You’ll also be able to manage your inventory, handle tax and shipping rates, and even integrate a payment processor.

5. Craft a Marketing Strategy

Simply publishing your site online isn’t enough; you need to craft a marketing strategy to help get the word out. It would be impossible to cover every single aspect of marketing your online store in this post, but here are a few questions and considerations to get you moving in the right direction: What marketing approaches will you take? Social media? Content marketing? Paid advertising? Influencer outreach? Guest blogging? How will you get customers to buy from you again? A loyalty program? Subscriber discounts? How will you convert traffic to buyers? Regular promotions? Product and upsell suggestions? What will make your strategy successful? Rising traffic? Conversions? Email list growth? As you can see, you have a lot to think about when it comes to marketing your store and ensuring its success over the long run.

6. Launch and Execute

The day has come to finally launch your online store and start sharing your products with the world. Even though it probably feels like your work is finished, it’s actually only just begun. All of the preparation work, research, and website building has been leading you up to this point. Continue to execute and experiment with your marketing strategy, and optimize your site based on user feedback, analytics data, and the kinds of products visitors are actually buying. Selling products online is a journey, and you’ve just taken your first steps. Hopefully, you’re now better equipped to create and launch a successful online store. Get your store up and running quickly with the GATOR website builder. Find the post on the HostGator Blog
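The manufacturing-versus-dropshipping trade-off in step 3 is ultimately arithmetic: a lower per-unit cost against more upfront work. A quick sketch of how unit margin shifts between the two models (the prices and fees below are made-up illustrative numbers, not from the post):

```python
def unit_margin(price, unit_cost, fulfillment_fee=0.0):
    """Return (profit, margin %) for one unit sold at `price`."""
    profit = price - unit_cost - fulfillment_fee
    return profit, round(100 * profit / price, 1)

# Hypothetical numbers: manufacturing yields a lower unit cost,
# while dropshipping adds a per-order fulfillment fee.
manufactured = unit_margin(price=40.0, unit_cost=12.0)
dropshipped = unit_margin(price=40.0, unit_cost=22.0, fulfillment_fee=4.0)
print(manufactured)  # prints (28.0, 70.0)
print(dropshipped)   # prints (14.0, 35.0)
```

Even with invented numbers, the shape of the comparison holds: the dropshipped unit nets half the margin, which is the "slimmer margins" caveat in concrete terms.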

Meet the Newest AWS Heroes (September 2018 Edition)

Amazon Web Services Blog -

AWS Heroes are passionate AWS enthusiasts who use their extensive knowledge to teach others about all things AWS across a range of mediums. Many Heroes eagerly share knowledge online via forums, social media, or blogs, while others lead AWS User Groups or organize AWS Community Day events. Their extensive efforts to spread AWS knowledge have a significant impact within their local communities. Today we are excited to introduce the newest AWS Heroes: Jaroslaw Zielinski, Jerry Hargrove, and Martin Buberl.

Jaroslaw Zielinski – Poznan, Poland

AWS Community Hero Jaroslaw Zielinski is a Solutions Architect at Vernity in Poznan (Poland), where he supports customers on their road to the cloud using cloud adoption patterns. Jaroslaw is a leader of AWS User Group Poland, which operates in 7 different cities around the country. Additionally, he connects the community with the biggest IT conferences in the region – PLNOG, DevOpsDay, and Amazon@Innovation, to name just a few. He supports numerous evangelism projects, like the Zombie Apocalypse Workshops and Cloud Builder’s Day. Bringing together various IT communities, he hosts the Cloud & Datacenter Day conference – the biggest community conference in Poland. In addition, his passion for IT carries over into his own blog, Popołudnie w Sieci, and he also publishes in various professional papers.

Jerry Hargrove – Kalama, USA

AWS Community Hero Jerry Hargrove is a cloud architect, developer, and evangelist who guides companies on their journey to the cloud, helping them build smart, secure, and scalable applications. Currently with Lucidchart, a leading visual productivity platform, Jerry is a thought leader in the cloud industry and specializes in AWS product and service breakdowns, visualizations, and implementation. He brings with him over 20 years of experience as a developer, architect, and manager for companies like Rackspace, AWS, and Intel.
You can find Jerry on Twitter compiling his famous sketch notes and creating Lucidchart templates that pinpoint practical tips for working in the cloud and help developers increase efficiency. Jerry is the founder of the AWS Meetup Group in Salt Lake City, often contributes to meetups in the Pacific Northwest and San Francisco Bay Area, and speaks at developer conferences worldwide. He holds several professional AWS certifications.

Martin Buberl – Copenhagen, Denmark

AWS Community Hero Martin Buberl brings the New York hustle to Scandinavia. As VP of Engineering at Trustpilot, he is on a mission to build the best engineering teams in the Nordics and Baltics. With a person-centered approach, his focus is on high-leverage activities to maximize impact, customer value, and iteration speed — and utilizing cloud technologies checks all those boxes. His cloud obsession made him an early adopter and evangelist of all types of AWS services throughout his career. Nowadays, he is especially passionate about Serverless, Big Data, and Machine Learning, and excited to leverage the cloud to transform those areas. Martin is an AWS User Group leader, organizer of AWS Community Day Nordics, and founder of the AWS Community Nordics Slack. He has spoken at multiple international AWS events — AWS User Groups, AWS Community Days, and AWS Global Summits — and looks forward to continuing to share his passion for software engineering and cloud technologies with the community. To learn more about the AWS Heroes program or to connect with an AWS Hero in your community, click here.

Default WordPress Settings That You Need to Change

InMotion Hosting Blog -

Once you get WordPress and a theme installed, it can seem like all that’s left is to add your content and your website is ready to go. Believe it or not, though, there are several WordPress default settings that could put your website at risk. In this article, we want to share the default settings that every WordPress website owner needs to change.

Login URL

Because WordPress has become so popular, brute force attacks against its well-known default login URL are common. Continue reading Default WordPress Settings That You Need to Change at The Official InMotion Hosting Blog.
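The brute-force problem the excerpt alludes to is mechanical: attackers hammer the well-known login URL with password guesses. Whatever WordPress setting you change, the standard countermeasure is throttling repeated failures per client. A minimal sliding-window sketch in Python (illustrative pseudocode for the idea, not WordPress code; the limits are arbitrary):

```python
import time
from collections import defaultdict, deque

MAX_ATTEMPTS = 5   # failures allowed per window (hypothetical limit)
WINDOW_SECS = 300  # 5-minute sliding window

_failures = defaultdict(deque)  # client IP -> timestamps of recent failures

def allow_login_attempt(ip, now=None):
    """Return False once an IP exceeds MAX_ATTEMPTS failures in the window."""
    now = time.time() if now is None else now
    q = _failures[ip]
    while q and now - q[0] > WINDOW_SECS:  # drop failures outside the window
        q.popleft()
    return len(q) < MAX_ATTEMPTS

def record_failure(ip, now=None):
    """Call this after each failed login."""
    _failures[ip].append(time.time() if now is None else now)
```

Security plugins and hosts implement variations of exactly this (often plus IP allowlists or a renamed login slug), which is why moving or protecting the login URL is on the article's list.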

10 Ways to Speed Up a DreamPress Website

DreamHost Blog -

If you’re anything like us, you’re always looking for ways to improve your website. For example, you’ll want to ensure that your site always runs fast, regardless of the number of visitors it receives. Failing to optimize your site’s speed can have dire consequences, such as higher bounce rates and lost income. Fortunately, speeding up your site doesn’t have to be difficult, especially if it’s hosted with DreamPress. More often than not, you will only need to perform a few simple tasks to fully optimize your site. Even if you’re no web wizard, you can still guarantee that your site will run fast and smoothly. In this article, we’ll start by discussing the importance of having a fast site. We’ll also cover 10 methods you can use to improve the speed of your DreamPress website right now. Let’s get started!

Why Speeding Up Your Website Is Important

As you might know, we’ve discussed the importance of website speed many times before — with good reason. If your site is suffering from slowdowns, it can both negatively affect your users’ experience and harm your business financially. The fact is that most internet users have come to expect sites to be fast, and when they’re faced with long loading times, they’re more likely to leave before performing any actions. The share of visitors who leave this quickly is referred to as your site’s bounce rate, and it’s something you’ll want to minimize. A site suffering from slow speeds is also likely to see fewer conversions and even a loss in profits. As such, the importance of speeding up your website should be evident. However, to do this, you’ll first need to know how well your site is currently performing.

How to Test Your Site’s Performance

It’s a smart idea to regularly test your site to see how quickly it loads and how well it handles increases in traffic. Even if you’ve optimized your site for speed in the past, it may have slowed down over time, so you’ll want to stay up-to-date with its performance.
Fortunately, testing the speed and performance of your site is much easier than you might expect. In fact, there are plenty of solutions you can use to do this right in your browser. Pingdom is a free tool that lets you enter the URL of the site you want to test and select a server location. You can then run a test, which will usually take less than a minute. Once that’s done, you can see how quickly your site managed to load from the specified server. Your site will get an overall score, and you’ll see how it compares against other sites. In this case, the tested website was faster than 37 percent of other sites and was given a ‘C’ score. That means this site could definitely use some optimization. It’s a smart idea to run this test a few times and to use a variety of servers. Best of all, Pingdom’s results will even point out specific areas where you can optimize your website to make it faster.

10 Ways to Speed up Your DreamPress Website

Once you’ve tested your site and found areas where it could improve, you can get to work. In this section, we’ll look at ten ways you can diagnose why your site is slowing down and optimize it for speed. Before we get started, we should mention that one of the biggest factors in your site’s performance is its hosting plan. An optimized, WordPress-specific plan such as DreamPress will do a lot to keep your site fast and stable. However, that doesn’t mean the following methods can’t improve its performance even further. Let’s jump right in!

1. Check Caches

Caching plays a vital part in making sure your site loads quickly, which is something we’ve covered in our complete guide to caching. The good news is that all DreamPress sites include built-in caching, so you don’t need to worry about installing a solution yourself. However, even with DreamPress’ caching functionality, you’ll still need to manage your site’s cache from time to time. You can do this with the Varnish HTTP Purge plugin.
This plugin is included on all DreamPress sites, and it automatically clears your cache when you post new content. As such, it prevents your site from displaying outdated files to visitors. Varnish HTTP Purge also includes a tool to test your caching so you can make sure it’s working correctly on your site. You can access this option from Varnish > Check Caching. Here, you can enter a URL for a page on your site to test its caching status. The result will show any errors and highlight whether your theme or plugins are causing problems with the cache. You can then go through these results to find areas that conflict with your cache. For more information, see our guide to managing the DreamPress cache.

2. Combine and Minify Scripts and Stylesheets

Minification is the process of removing unnecessary content from HTML, JavaScript, and CSS code to make your site run faster. This might sound complex if you’re not a coder, but it’s not hard to understand why it works. In a nutshell, most code is written not just to be functional but also to be easily readable by humans. This results in excess information that isn’t strictly necessary. By minifying the code, you keep the functionality intact but make it much faster for computers to read and run. Several plugins can help you do this. For example, Autoptimize will automatically ensure that the scripts on all of your pages are optimized. What sets this tool apart from many other minification plugins is that it also optimizes wp-admin. All you need to do after installing and activating the plugin is navigate to Settings > Autoptimize. Here you can decide what you want the plugin to do. We recommend that you start with as few options as possible, since selecting too many may cause issues with your site. You may want to simply enable optimization for HTML, CSS, and JavaScript, and then save your changes.

3. Compress and ‘Lazy Load’ Your Images

Your site likely contains a fair number of images, which is great for your site’s look but can be a real problem when it comes to loading times. Many image files, particularly large or high-quality ones, can be very heavy. This is a common cause of slowdowns. To avoid this issue, take care to optimize and compress images; this will severely cut down on file sizes. There are several browser-based and downloadable tools you can use for this, such as TinyPNG, which also compresses JPG files. However, to make things even easier, you can install an image compression plugin. This will automatically decrease the size of any images you upload, including their thumbnails. You can even set the level of compression you want. One such plugin that we have recommended in the past and still favor is ShortPixel. ShortPixel is very easy to use and offers bulk optimization of your existing images. You even get a certain number of free optimizations every month. This makes it an excellent choice, particularly if you want a quick plug-and-play solution. Another way you can improve your images is by implementing a ‘lazy loading’ solution. With this feature in place, only images that are currently visible on the screen will be loaded. That can help to speed up your pages, particularly if they contain a large number of visuals. As you might expect, there are a number of tools available to help you implement this functionality. For example, the popular Jetpack plugin contains a lazy load feature for images. There are also dedicated plugins, such as the aptly-named a3 Lazy Load. This plugin is a simple yet powerful option. It will ensure that all images (including avatars, thumbnails, and those inside widgets) are only loaded when a user scrolls down to their locations on a page. Another handy alternative is Crazy Lazy. While this plugin features most of the same functionality, it is very lightweight and easy to use.
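Under the hood, lazy-load plugins like these typically rewrite each image tag at render time: the real URL moves to a data attribute, a tiny placeholder goes into src, and a small script swaps them back as the image scrolls into view. A rough Python sketch of that markup transformation (the data-src and lazy class names are illustrative conventions, not any specific plugin’s):

```python
import re

# 1x1 transparent GIF commonly used as an inline placeholder.
PLACEHOLDER = "data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"

def lazify(html):
    """Move each <img> src into data-src and insert a tiny placeholder."""
    return re.sub(
        r'<img([^>]*?)\ssrc="([^"]+)"',
        rf'<img\1 src="{PLACEHOLDER}" data-src="\2" class="lazy"',
        html,
    )
```

A companion JavaScript snippet (which the plugins bundle for you) then watches scrolling and copies data-src back into src for images entering the viewport, so the browser only downloads what the visitor actually sees.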
Regardless of which option you choose, you should find that image-heavy pages will be faster and easier to navigate as a result.

4. Review Plugins

While plugins are incredibly useful for customizing and expanding your site’s functionality, some can hurt its performance. If your site is starting to slow down, a good place to start diagnosing the problem is to check your plugins. Some plugins hog a lot of server resources or simply take up too much space. If you’ve recently added a plugin to your site, try disabling it to see if that brings your speeds back up. You should also see if any of your plugins have been recently updated, which could be a cause of the problem. Plugins can also interfere with your site’s caching, which we discussed earlier. To find out if this is happening, you can run a caching test with Varnish HTTP Purge to see whether one or more plugins conflict with your cache. If you find that a plugin is the root of your performance problems, you may want to look for a more lightweight alternative. It might also be worth reaching out to the plugin’s developers, as the performance issues may be due to a bug they’d like to be aware of.

5. Use a Fast Theme

When you choose a theme for your site, you’re most likely focusing on its appearance and features. However, this could result in picking a low-performance theme that drags down your site. As such, you should always check a theme’s user reviews first to see if other people have encountered speed issues. You can also test to see if your theme is slowing down your site. This can be done by temporarily replacing it with one of your site’s default themes. Just access Appearance > Themes, and click on Activate next to one of the basic WordPress options. Now, test your site’s speed again to see if you notice a significant difference. If so, you may want to look for a more optimized theme as a replacement.
Finally, you should also try to find a theme that supports Accelerated Mobile Pages (AMP), as this will help boost your site’s speed on phones and tablets. This is incredibly important, since many of your users will be visiting your site from mobile devices. DYK? DreamPress Plus and Pro users get access to Jetpack Professional (and 200+ premium themes) at no added cost!

6. Optimize Your Database

One element that’s easy to forget about is your site’s database. While this is a vital component of every site, you rarely have to worry about it. However, if you leave your database alone for too long, it can fill up with old or unnecessary data, causing slowdowns as a result. The best way to avoid this, especially if you’re unfamiliar with databases, is to use a plugin. In this case, we recommend WP-DBManager. This is an all-in-one database solution that can repair, back up, and optimize your database. To do the latter, just navigate to Database > Optimize DB. Select all the data tables you want to include and then click on Optimize. It’s a good idea to do this regularly (at least once a month) to make sure your database is always in top shape.

7. Check for 404 Errors

Another possible cause of slow loading times is missing files or broken links. For instance, if a server is trying to locate a particular file to no avail, it may be using up precious resources for nothing, hampering your site’s performance in the process. These missing files are usually known as 404s, since they often result in the “404 Not Found” error. Several common files can be affected by this problem, such as your site’s ‘favicon’ (the icon that appears in your browser next to your site’s name), your robots.txt file (which enables you to exclude specific areas of your site from search engines), and your sitemap. These missing files and broken links can affect your site negatively in many ways.
For one, it doesn’t look great to a visitor if parts of your site just aren’t there or your links don’t work properly. Plus, as we mentioned, performance can also suffer as a result. As such, you’ll want to make it a habit to check your site for broken links regularly. One plugin that can help you accomplish this is Broken Link Checker. This is yet another plugin that does most of the required work automatically, without much input needed from you. In short, it will check for broken links and 404 errors across your site and notify you via email or on the dashboard when any issues are found. You can even edit links directly from the plugin’s tab, sparing you the potential headache of managing multiple broken links individually by editing the pages yourself. This plugin is a useful solution, and it’s particularly helpful when it comes to large sites with hundreds (or even thousands) of pages.

8. Look for Unusual Traffic

Another reason your site can slow down is a sudden increase in traffic. Sometimes this is perfectly natural, for example, if one of your posts has gone viral and draws a lot of new visitors to your site. However, it could also be due to more malicious causes, such as a Distributed Denial-of-Service (DDoS) attack. To prevent these attacks, you’ll want to make sure that your site is secure. You’ll also need a way to see if particular areas of your site are receiving more traffic than you would expect, as these could represent a potential security vulnerability that attackers are attempting to exploit. Using a tracking solution like Google Analytics can be very helpful in keeping an eye on your site’s traffic.

9. Use Content Delivery Networks (CDN)

When someone visits your site, its files usually travel to them all the way from your host’s server. This is true regardless of the visitor’s geographic location or the number of other visitors currently using your site.
As you can imagine, this can be quite strenuous for one server to handle, which can lead to longer loading times. The best way to fix this is to use a Content Delivery Network (CDN). A CDN is a network of servers that are spread across the world and all contain copies of your site’s files. This means that when a user accesses your site, its files will be sent from the CDN server closest to them. A CDN minimizes delays due to geographic distance and also spreads the strain across multiple servers. There is a huge number of CDN solutions for WordPress, each with its own advantages. For example, the popular Jetpack plugin includes a CDN for image files, which can help speed up your site. Jetpack Professional, which is included with the purchase of a DreamPress Plus or Pro hosting plan, also includes a CDN for video.

10. Keep Your Site Updated

Last but by no means least, we come to the oldest trick in the book. You’ve doubtless heard this repeated time and time again — including from us — but it’s for a good reason. If you don’t keep your WordPress installation, plugins, and theme updated, they can quickly become sluggish and may cause a number of other problems. Not only can failing to perform regular updates lead to significantly worse performance, but it can also leave your site vulnerable to security issues. Fortunately, WordPress makes updates easy. They’re always clearly highlighted in your admin area, and you can find them under Dashboard > Updates. Here you’ll see if there’s a new version of WordPress or any updates available for your themes or plugins. You should check this page regularly to ensure that your site stays optimized. Of course, don’t forget to back up your site, and consider creating a staging site for testing updates before they go live.

Zero to Sixty

Speed is one of the most critical elements of a successful website.
After all, if a site takes forever to load, very few people will bother to stick around and see the actual content. As such, making sure your website is fast is a top priority. As luck would have it, doing this isn’t even all that difficult. Do you have any questions about speeding up your DreamPress website? Join the conversation in our DreamHost Community! The post 10 Ways to Speed Up a DreamPress Website appeared first on DreamHost.
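The plugin approach in step 7 can be approximated by hand: collect every link and image reference on a page, request each one, and flag responses with a 4xx status. A stdlib-only Python sketch of the link-collection half (the checking half is just urllib requests plus status-code comparisons):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather href/src URLs from a page so each can be checked for a 404."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.urls.append(value)

def extract_links(html):
    collector = LinkCollector()
    collector.feed(html)
    return collector.urls
```

A dedicated plugin is still the better tool for a large site, since it crawls every page, retries, and reports in the dashboard, but this is the mechanism at its core.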

Ready for cPanel & WHM Version Certification 2018?

cPanel Blog -

cPanel Conference time is closing in steadily, with only 18 days left until the big event. Here on the cPanel University team, we’re continuing our tradition of offering a special certification. This certification is only available to the lucky folks who will be attending the conference in person and who successfully complete a comprehensive exam. The exam covers the latest and greatest features and changes made over the past year’s worth of cPanel & WHM releases. Last …

How to Get More Engagement With Facebook Live

Social Media Examiner -

Want more people to watch, share, and comment on your live videos? Looking for tips on improving the quality of viewer engagement? To explore how to get more engagement with Facebook Live video, I interview Stephanie Liu. More About This Show The Social Media Marketing podcast is designed to help busy marketers, business owners, and [...] The post How to Get More Engagement With Facebook Live appeared first on Social Media Examiner.

New – Parallel Query for Amazon Aurora

Amazon Web Services Blog -

Amazon Aurora is a relational database that was designed to take full advantage of the abundance of networking, processing, and storage resources available in the cloud. While maintaining compatibility with MySQL and PostgreSQL on the user-visible side, Aurora makes use of a modern, purpose-built distributed storage system under the covers. Your data is striped across hundreds of storage nodes distributed over three distinct AWS Availability Zones, with two copies per zone, on fast SSD storage. Here’s what this looks like (extracted from Getting Started with Amazon Aurora):

New Parallel Query

When we launched Aurora we also hinted at our plans to apply the same scale-out design principle to other layers of the database stack. Today I would like to tell you about our next step along that path. Each node in the storage layer pictured above also includes plenty of processing power. Aurora is now able to make great use of that processing power by taking your analytical queries (generally those that process all or a large part of a good-sized table) and running them in parallel across hundreds or thousands of storage nodes, with speed benefits approaching two orders of magnitude. Because this new model reduces network, CPU, and buffer pool contention, you can run a mix of analytical and transactional queries simultaneously on the same table while maintaining high throughput for both types of queries. The instance class determines the number of parallel queries that can be active at a given time:

db.r*.large – 1 concurrent parallel query session
db.r*.xlarge – 2 concurrent parallel query sessions
db.r*.2xlarge – 4 concurrent parallel query sessions
db.r*.4xlarge – 8 concurrent parallel query sessions
db.r*.8xlarge – 16 concurrent parallel query sessions
db.r4.16xlarge – 16 concurrent parallel query sessions

You can use the aurora_pq parameter to enable and disable the use of parallel queries at the global and the session level.
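The instance-class list above follows a simple doubling rule that caps at 16 sessions; a small lookup sketch makes the mapping explicit:

```python
# Concurrent Parallel Query sessions per instance-class size suffix,
# per the list above (the count doubles with size, capped at 16).
SESSIONS = {
    "large": 1,
    "xlarge": 2,
    "2xlarge": 4,
    "4xlarge": 8,
    "8xlarge": 16,
    "16xlarge": 16,  # capped at 16
}

def parallel_query_sessions(instance_class):
    """e.g. 'db.r4.2xlarge' -> 4."""
    size = instance_class.rsplit(".", 1)[-1]
    return SESSIONS[size]
```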
Parallel queries enhance the performance of over 200 types of single-table predicates and hash joins. The Aurora query optimizer will automatically decide whether to use Parallel Query based on the size of the table and the amount of table data that is already in memory; you can also use the aurora_pq_force session variable to override the optimizer for testing purposes. Parallel Query in Action You will need to create a fresh cluster in order to make use of the Parallel Query feature. You can create one from scratch, or you can restore a snapshot. To create a cluster that supports Parallel Query, I simply choose Provisioned with Aurora parallel query enabled as the Capacity type: I used the CLI to restore a 100 GB snapshot for testing, and then explored one of the queries from the TPC-H benchmark. Here’s the basic query: SELECT l_orderkey, SUM(l_extendedprice * (1-l_discount)) AS revenue, o_orderdate, o_shippriority FROM customer, orders, lineitem WHERE c_mktsegment='AUTOMOBILE' AND c_custkey = o_custkey AND l_orderkey = o_orderkey AND o_orderdate < date '1995-03-13' AND l_shipdate > date '1995-03-13' GROUP BY l_orderkey, o_orderdate, o_shippriority ORDER BY revenue DESC, o_orderdate LIMIT 15; The EXPLAIN command shows the query plan, including the use of Parallel Query: +----+-------------+----------+------+-------------------------------+------+---------+------+-----------+--------------------------------------------------------------------------------------------------------------------------------+ | id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra | +----+-------------+----------+------+-------------------------------+------+---------+------+-----------+--------------------------------------------------------------------------------------------------------------------------------+ | 1 | SIMPLE | customer | ALL | PRIMARY | NULL | NULL | NULL | 14354602 | Using where; Using temporary; Using filesort | | 1 | SIMPLE | orders | 
ALL | PRIMARY,o_custkey,o_orderdate | NULL | NULL | NULL | 154545408 | Using where; Using join buffer (Hash Join Outer table orders); Using parallel query (4 columns, 1 filters, 1 exprs; 0 extra) | | 1 | SIMPLE | lineitem | ALL | PRIMARY,l_shipdate | NULL | NULL | NULL | 606119300 | Using where; Using join buffer (Hash Join Outer table lineitem); Using parallel query (4 columns, 1 filters, 1 exprs; 0 extra) | +----+-------------+----------+------+-------------------------------+------+---------+------+-----------+--------------------------------------------------------------------------------------------------------------------------------+ 3 rows in set (0.01 sec) Here is the relevant part of the Extras column: Using parallel query (4 columns, 1 filters, 1 exprs; 0 extra) The query runs in less than 2 minutes when Parallel Query is used: +------------+-------------+-------------+----------------+ | l_orderkey | revenue | o_orderdate | o_shippriority | +------------+-------------+-------------+----------------+ | 92511430 | 514726.4896 | 1995-03-06 | 0 | | 593851010 | 475390.6058 | 1994-12-21 | 0 | | 188390981 | 458617.4703 | 1995-03-11 | 0 | | 241099140 | 457910.6038 | 1995-03-12 | 0 | | 520521156 | 457157.6905 | 1995-03-07 | 0 | | 160196293 | 456996.1155 | 1995-02-13 | 0 | | 324814597 | 456802.9011 | 1995-03-12 | 0 | | 81011334 | 455300.0146 | 1995-03-07 | 0 | | 88281862 | 454961.1142 | 1995-03-03 | 0 | | 28840519 | 454748.2485 | 1995-03-08 | 0 | | 113920609 | 453897.2223 | 1995-02-06 | 0 | | 377389669 | 453438.2989 | 1995-03-07 | 0 | | 367200517 | 453067.7130 | 1995-02-26 | 0 | | 232404000 | 452010.6506 | 1995-03-08 | 0 | | 16384100 | 450935.1906 | 1995-03-02 | 0 | +------------+-------------+-------------+----------------+ 15 rows in set (1 min 53.36 sec) I can disable Parallel Query for the session (I can use an RDS custom cluster parameter group for a longer-lasting effect): set SESSION aurora_pq=OFF; The query runs considerably slower without it: 
+------------+-------------+-------------+----------------+
| l_orderkey | o_orderdate | revenue     | o_shippriority |
+------------+-------------+-------------+----------------+
|   92511430 | 1995-03-06  | 514726.4896 |              0 |
...
|   16384100 | 1995-03-02  | 450935.1906 |              0 |
+------------+-------------+-------------+----------------+
15 rows in set (1 hour 25 min 51.89 sec)

This was on a db.r4.2xlarge instance; other instance sizes, data sets, access patterns, and queries will perform differently. I can also override the query optimizer and insist on the use of Parallel Query for testing purposes:

set SESSION aurora_pq_force=ON;

Things to Know
Here are a few things to keep in mind when you start to explore Amazon Aurora Parallel Query:

Engine Support – We are launching with support for MySQL 5.6, and are working on support for MySQL 5.7 and PostgreSQL.
Table Formats – The table row format must be COMPACT; partitioned tables are not supported.
Data Types – The TEXT, BLOB, and GEOMETRY data types are not supported.
DDL – The table cannot have any pending fast online DDL operations.
Cost – You can make use of Parallel Query at no extra charge. However, because it makes direct access to storage, there is a possibility that your IO cost will increase.

Give it a Shot
This feature is available now and you can start using it today!

— Jeff;
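To make the aggregation in the benchmark query above concrete, here is a small Python sketch of the same revenue expression; the sample rows are invented for illustration and are not TPC-H data:

```python
from collections import defaultdict

# Invented sample rows: (l_orderkey, l_extendedprice, l_discount)
lineitems = [
    (1, 1000.0, 0.10),
    (1, 500.0, 0.00),
    (2, 2000.0, 0.25),
]

# revenue = SUM(l_extendedprice * (1 - l_discount)) GROUP BY l_orderkey
revenue = defaultdict(float)
for orderkey, price, discount in lineitems:
    revenue[orderkey] += price * (1 - discount)

# ORDER BY revenue DESC
top = sorted(revenue.items(), key=lambda kv: -kv[1])
print(top)  # [(2, 1500.0), (1, 1400.0)]
```

The SQL engine does exactly this per group, except that with Parallel Query the scan and filter work is pushed down to the storage layer across many nodes.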

Renaming Proxy Subdomains to Service Subdomains

cPanel Blog -

In cPanel & WHM version 76, which we expect to be in EDGE this week, we renamed “Proxy Subdomains” to “Service Subdomains” due to improvements we are making under the hood. Let’s talk about where they came from, and why we’re changing their name! What are Proxy Subdomains? Proxy subdomains allow users to connect indirectly to the cPanel & WHM login pages. Rather than opening example.com:2083, they can open cpanel.example.com. Proxy subdomains have two primary uses for hosting providers …
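To illustrate what these subdomains stand in for, here is a small Python sketch mapping the conventional cPanel service subdomains to the well-known SSL ports they front; the helper function is hypothetical and covers only three common services:

```python
# Conventional cPanel service subdomains and the SSL ports they stand in for.
# (Well-known cPanel ports; this mapping is a sketch, not an exhaustive list.)
SERVICE_PORTS = {
    "cpanel": 2083,   # cPanel login
    "whm": 2087,      # WHM login
    "webmail": 2096,  # Webmail login
}

def service_url(subdomain_host: str) -> str:
    """Translate e.g. 'cpanel.example.com' into 'https://example.com:2083'."""
    service, _, domain = subdomain_host.partition(".")
    port = SERVICE_PORTS[service]
    return f"https://{domain}:{port}"

print(service_url("cpanel.example.com"))  # https://example.com:2083
```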

Building Community and Celebrating Women in Tech: We’re in it together

LinkedIn Official Blog -

We've seen a shift in what the traditional notion of success looks like. For many, it's not defined by a title, promotion or securing the corner office. The definition of success is expanding to include things like building a great team, establishing a side hustle or finding the perfect work/life balance. As a way to determine what success means to you, we’re asking our members, “What are you in it for?” For me, I’ve been fortunate enough to have mentors play an important role in pushing me…

5 Steps to a Successful Website Migration

Nexcess Blog -

Website migrations can be scary, but they don’t have to be. Here are 5 steps for making your moving experience as seamless as possible, starting with knowing what you need to back up and finishing with full DNS propagation and your new hosting solution going live. It’s not every day you decide to change hosting providers… Continue reading →

cPanel, the Hosting Platform of Choice, is Excited to Announce the Release of cPanel & WHM Version 74

My Host News -

HOUSTON, TX – cPanel & WHM® Version 74 is the third release of 2018 and includes improvements in many of cPanel & WHM’s most popular features. Git Version Control allows users to easily create and manage Git repositories on their cPanel hosting account. Version 74 now includes the ability to automatically deploy sites and applications from a repository, making it even easier to host applications on cPanel servers. The development team has been hard at work over the last year to improve the speed and quality of cPanel’s Automated Backups. cPanel users benefit from up to 60 percent faster backups and restores. The File and Directory Restore interface for server administrators and hosting users, along with new and improved remote destinations, make cPanel’s backup system one of the most robust on the market. AutoSSL and SSLs ordered through the cPanel Market Place will see a much higher success rate with this version. cPanel has improved its Domain Control Validation (DCV). DNS-Based validation and Ancestor-based (or primary-domain) validation reduce the work required of a user to get an SSL. “cPanel & WHM Version 74 shows what can happen when you stay focused on what your customers need,” said Ken Power, Vice President of Product Development. “Across-the-board performance improvement, features focused on the evolving hosting market and tools that make server management a breeze. These changes and more make cPanel & WHM Version 74 our best release to date.” As server administrators move to adopt cPanel & WHM Version 74, cPanel will see nearly 80 percent adoption across all cPanel & WHM servers. Take a look at the feature highlights on the cPanel & WHM Version 74 Release Site and read the full Version 74 release notes in the cPanel & WHM Documentation. About cPanel Since 1997, cPanel has been delivering the web hosting industry’s most reliable and intuitive web hosting automation software. 
This robust automation software helps businesses thrive and allows them to focus on more important things. Customers and partners receive first-class support and a rich feature set, making cPanel & WHM the Hosting Platform of Choice. For more information about cPanel, visit https://cpanel.com.

Colocation Northwest Continues Growth of Enterprise Facilities with Bellevue Washington Data Center Expansion to One Megawatt

My Host News -

Seattle, WA – Colocation Northwest, a division of IsoFusion, a leading provider of Internet access and IT services in the Puget Sound region, announced today it completed upgrades to its Bellevue Washington data center to allow for one megawatt of service capacity. The data center, located in Bellevue Washington’s Eastgate neighborhood, completed a four-fold capacity increase from 250 kilowatts to one megawatt. The expansion included a state-of-the-art data center wing, immediately occupied by a global top-5 video game publisher and distributor. The data center features state-of-the-art power, cooling, fire suppression and security. To comply with the City of Bellevue’s sound impact concerns, Colocation Northwest installed Liebert DSE zero-water cooling units, MC Outdoor Condensers and Econophase pumped refrigerant economizers. Not only is this system whisper quiet, it significantly improves data center efficiencies. “Our Bellevue colocation facility is open to any business requiring safe and reliable lease space for its IT equipment”, said Stephen Milton, Colocation Northwest CEO. “The data center expansion offers Eastside businesses a previously unavailable combination of computing power and proximity.” The Bellevue data center supplies Tier I connectivity, the highest level of data and internet communications via Colocation Northwest’s redundant fiber network. Colocation Northwest operates a proprietary 180Gbps dark fiber loop that circles Lake Washington and interconnects all of the company’s data centers with additional network fiber routes to Colocation Northwest’s California locations and transpacific network access points to Asia and Pacific ring networks. Users can use this fiber loop to access more than 250 carriers and service providers, including high-speed international access. 
The Colocation Northwest Bellevue data center offers powerful computing for Eastside businesses in their own community with the option for systems management and connectivity to Colocation Northwest’s west coast redundant network of data center locations. Coupled with its expansion last year into the Centeris South Hill data center campus, Colocation Northwest continues to strengthen its ability to service enterprise and high-density customers with the most reliable, scalable, fully managed colocation solutions and facilities on the west coast. About Colocation Northwest and IsoFusion: Colocation Northwest, a division of IsoFusion Inc. is one of the largest privately held ISP and Colocation providers in Western Washington. Founded in 1991 as ISOMEDIA, IsoFusion is a full solution provider of Internet related products and services that cater to businesses, as well as providing complex solutions to companies with a national presence. A competitive local exchange carrier (CLEC) in the state of Washington, IsoFusion offers a full range of services providing everything from commercial Fiber and Ethernet connections, custom Fiber to the Home (FTTH) community solutions, cloud, hosting and dedicated server options, managed data center colocation services and technology consulting for businesses. IsoFusion provides managed enterprise solutions and outstanding service to prominent Northwest businesses. IsoFusion is headquartered in Seattle, Washington and provides exceptional quality, value, and service to over 23,000 residential and business customers across the West Coast. For more information visit http://www.colocationnorthwest.com or http://www.isofusion.com.

HostColor.com Launches Managed Website Hosting Service

My Host News -

South Bend, IN – HostColor.com (HC), a web hosting provider since 2000, has recently launched a new service named Managed Website, which includes technical and system administration. The difference between regular self-managed web hosting and the Managed Website is that the former is a do-it-yourself service. The owners of self-managed web hosting accounts usually receive login credentials for their control panel and account management system. The control panel is used for configuration and automation of various services such as http/https, email, antispam, antivirus, databases, ftp, storage space and others, delivered from a shared server environment. What is Managed Website? Managed Website is an IT hosting service that uses the same technology environment as self-managed web hosting. There is one major difference, however. The Managed Website comes with technical and system administration. This means that website owners no longer need to learn how control panels and all other website services work in order to configure them to their custom settings. In other words, the Managed Website makes it possible for individuals and small business owners to save a lot of time by outsourcing all technical workloads to Host Color. All HC Managed Website Hosting plans feature 1 hour of technical and system administration by default. Website owners can extend the managed service on demand. “The average site owner who manages a website based on the most popular LAMP (Linux, Apache, MySQL, PHP) server environment usually spends between 8 and 16 hours per month on technical administration. Those are numbers we have received from a survey we conducted among our clients who use the Self-Managed Web Hosting services”, says Dani Stoychev, an Operations Manager at HostColor.com.
He also added that many people who spend more than 8 hours per month working with the cPanel control panel and on the backend of their websites usually like doing that, and this is the main reason they handle the technical administration themselves. “Still, there are many small business owners who simply do not calculate the time spent on managing their websites and hosting accounts. At the end of the month many of them find that they have lost 10 hours on average dealing with technical issues, instead of spending more time on developing and growing their business”, adds the HC Operations Manager. The main reason for Host Color to announce the Managed Web Hosting service is that website technology frameworks have become much more complex, and people no longer deal with just a handful of technologies such as HTML, CSS, JavaScript, PHP and a few others. Most content management systems (CMS) such as WordPress, Drupal and Joomla have grown into whole ecosystems. In order to properly manage any live version of a website, one needs specific operational knowledge and experience with server-side installations. Those who create and manage websites need broad knowledge of the Linux OS, web servers, databases, various open source frameworks, hosting automation control panels, CMSs and various platforms and applications. Even the web designers and digital agencies who specialize in building websites on certain CMSs aren’t usually familiar with the server-side workloads, website maintenance and troubleshooting. “There are 3 major reasons for website management work to be shifted to the infrastructure providers”, says Dimitar Avramov, the CEO of HC. “Web hosting companies are open 24/7; their support teams have the knowledge to manage live versions of the websites; and managed web hosting providers charge less for maintenance than the digital agencies.” HC has listed five different Managed Website plans on its website.
Each of them features a different amount of computing resources. However, all of them come with “1 hour of Managed Service” per month. The company’s customers can add as many hours of technical and system administration as they need by paying a reasonably low per-hour fee. There are also monthly Managed Service packages available at special pricing for commitments of 4 hours or more. The Managed Website includes a Domain Validated SSL certificate, a dedicated IP address, a certain number of hosted websites, enterprise-grade SSD storage with built-in data protection, uptime monitoring and, most importantly, an onsite SEO service (website and content analysis). HC customers have the flexibility to subscribe to the Managed Website hosting service on a monthly basis. About Host Color Host Color – AS46873 – has been a web hosting provider since 2000. It operates a fully-redundant, 100% uptime network and peers with more than 70 quality Internet networks and ISPs. The company is also a Cloud Computing service provider and delivers Public, Private and Hybrid Cloud servers from various data centers across North America. Its main data center is based out of South Bend, Indiana, 90 miles from Chicago. HC also provides disaster recovery, colocation and dedicated hosting in Europe through Host Color Europe.

ServerCentral Announces PCI DSS 3.2.1 Compliance

My Host News -

CHICAGO, IL – ServerCentral, a leading provider of managed IT infrastructure solutions, today announced its VMware-powered Enterprise Cloud has received compliance certification with the Payment Card Industry Data Security Standard (PCI DSS). “When there’s a chance to improve security, we want everyone to have it — no extra fees, no migration. If you work with ServerCentral, you get the best protection we can offer as soon as we can offer it,” said Tom Kiblin, Vice President of Managed Services at ServerCentral. “PCI DSS certification — the industry gold-standard for securely processing credit cards — is no different.” ServerCentral’s annual AT-101 SOC 2 Type II audit serves as a foundation for reaching the latest PCI designation (version 3.2.1) and for helping customers meet their own compliance requirements: “We really didn’t have to change a lot, which is a testament to how seriously we take security,” said Joe Johnson, IT Director and Compliance Officer at ServerCentral. “We were already following the proper protocols; we simply needed to organize our existing processes in the manner required for the PCI Security Standards Council.” ServerCentral services that are also in scope for PCI DSS compliance include Colocation and Disaster Recovery. ServerCentral is constantly working to expand PCI DSS coverage across its Private Cloud, Hybrid Cloud, and Dedicated Server services. If your application requires PCI DSS certification or you are looking for help with your PCI DSS compliance requirements, visit https://www.servercentral.com/compliance to see how ServerCentral can help. ServerCentral’s Enterprise Cloud is architected for applications with premium performance requirements. To learn more, visit https://www.servercentral.com/enterprise-cloud/ and https://www.servercentral.com/compliance/. About ServerCentral ServerCentral is a managed IT infrastructure solutions provider. 
Since 2000, leading technology, finance, healthcare, and e-commerce firms have chosen ServerCentral to design and manage their mission-critical IT infrastructure in data centers across North America, Europe, and Asia. Whether it’s colocation, managed services, or a cloud, ServerCentral designs the optimal solution for each client. About the PCI Security Standards Council The PCI Security Standards Council (PCI SSC) leads a global, cross-industry effort to increase payment security by providing industry-driven, flexible and effective data security standards and programs that help businesses detect, mitigate and prevent cyberattacks and breaches. Connect with the PCI SSC on LinkedIn.

How to Find and Hire the Right WordPress Theme Developer

InMotion Hosting Blog -

Standing out from your competitors with your website can help bring authority and credibility to your brand. If your WordPress website has been online for a while, you may want to consider a custom WordPress theme. With most premium WordPress themes, no matter how many images and colors you change, you can never fully remove the theme’s built-in styling and layout. We’re going to discuss the process of finding the right custom theme developer to help you build a professional custom WordPress theme. Continue reading How to Find and Hire the Right WordPress Theme Developer at The Official InMotion Hosting Blog.

Harnessing Data and Analytics to Personalize Customer Experiences

The Rackspace Blog & Newsroom -

Most financial services and retail companies spend a lot of time and money creating rich customer stories and personas to improve marketing and sales. These engineered customer stories help target general markets and improve offerings on a broad scale. You probably also spend much of your strategic IT budget on user experience applications, enhancing mobile […] The post Harnessing Data and Analytics to Personalize Customer Experiences appeared first on The Official Rackspace Blog.

Introducing the Cloudflare Onion Service

CloudFlare Blog -

When: a cold San Francisco summer afternoon
Where: Room 305, Cloudflare
Who: 2 from Cloudflare + 9 from the Tor Project

What could go wrong?

Bit of Background
Two years ago this week Cloudflare introduced Opportunistic Encryption, a feature that provided additional security and performance benefits to websites that had not yet moved to HTTPS. Indeed, back in the old days some websites only used HTTP --- weird, right? “Opportunistic” here meant that the server advertised support for HTTP/2 via an HTTP Alternative Service header in the hopes that any browser that recognized the protocol could take advantage of those benefits in subsequent requests to that domain. Around the same time, CEO Matthew Prince wrote about the importance and challenges of privacy on the Internet and tasked us to find a solution that provides convenience, security, and anonymity. From neutralizing fingerprinting vectors and everyday browser trackers that Privacy Badger feeds on, all the way to mitigating correlation attacks that only big actors are capable of, guaranteeing privacy is a complicated challenge. Fortunately, the Tor Project addresses this extensive adversary model in Tor Browser. However, the Internet is full of bad actors, and distinguishing legitimate traffic from malicious traffic, which is one of Cloudflare’s core features, becomes much more difficult when the traffic is anonymous. In particular, many features that make Tor a great tool for privacy also make it a tool for hiding the source of malicious traffic. That is why many resort to using CAPTCHA challenges to make it more expensive to be a bot on the Tor network. There is, however, collateral damage associated with using CAPTCHA challenges to stop bots: human eyes also have to deal with them. One way to minimize this is using privacy-preserving cryptographic signatures, aka blinded tokens, such as those that power Privacy Pass. The other way is to use onions.
Here Come the Onions
Today’s edition of the Crypto Week introduces an “opportunistic” solution to this problem, so that under suitable conditions, anyone using Tor Browser 8.0 will benefit from improved security and performance when visiting Cloudflare websites without having to face a CAPTCHA. At the same time, this feature enables more fine-grained rate-limiting to prevent malicious traffic, and since the mechanics of the idea described here are not specific to Cloudflare, anyone can reuse this method on their own website.

Before we continue, if you need a refresher on what Tor is or why we are talking about onions, check out the Tor Project website or our own blog post on the DNS resolver onion from June.

As Matthew mentioned in his blog post, one way to sift through Tor traffic is to use the onion service protocol. Onion services are Tor nodes that advertise their public key, encoded as an address with the .onion TLD, and use “rendezvous points” to establish connections entirely within the Tor network. While onion services are designed to provide anonymity for content providers, media organizations use them to allow whistleblowers to communicate securely with them, and Facebook uses one to tell Tor users from bots.

The technical reason why this works is that from an onion service’s point of view each individual Tor connection, or circuit, has a unique but ephemeral number associated with it, while from a normal server’s point of view all Tor requests made via one exit node share the same IP address. Using this circuit number, onion services can distinguish individual circuits and terminate those that seem to behave maliciously. To clarify, this does not mean that onion services can identify or track Tor users.

While bad actors can still establish a fresh circuit by repeating the rendezvous protocol, doing so involves a cryptographic key exchange that costs time and computation. Think of this like a cryptographic dial-up sequence.
Spammers can dial our onion service over and over, but every time they have to repeat the key exchange. Alternatively, finishing the rendezvous protocol can be thought of as a small proof of work required in order to use the Cloudflare Onion Service. This increases the cost of using our onion service for performing denial of service attacks.

Problem solved, right?
Not quite. As discussed when we introduced the hidden resolver, the problem of ensuring that a seemingly random .onion address is correct is a barrier to usable security. In that case, our solution was to purchase an Extended Validation (EV) certificate, which costs considerably more. Needless to say, this limits who can buy an HTTPS certificate for their onion service to a privileged few. Some people disagree. In particular, the new generation of onion services resolves the weakness that Matthew pointed to as a possible reason why the CA/B Forum only permits EV certificates for onion services. This could mean that getting Domain Validation (DV) certificates for onion services could be possible soon. We certainly hope that’s the case. Still, DV certificates lack the organization name (e.g. “Cloudflare, Inc.”) that appears in the address bar, and cryptographically relevant numbers are nearly impossible for humans to remember or distinguish. This brings us back to the problem of usable security, so we came up with a different idea.

Looking at onion addresses differently
Forget for a moment that we’re discussing anonymity. When you type “cloudflare.com” in a browser and press enter, your device first resolves that domain name into an IP address, then your browser asks the server for a certificate valid for “cloudflare.com” and attempts to establish an encrypted connection with the host. As long as the certificate is trusted by a certificate authority, there’s no reason to mind the IP address. Roughly speaking, the idea here is to simply switch the IP address in the scenario above with an .onion address.
As long as the certificate is valid, the .onion address itself need not be manually entered by a user or even be memorable. Indeed, the fact that the certificate was valid indicates that the .onion address was correct. In particular, in the same way that a single IP address can serve millions of domains, a single .onion address should be able to serve any number of domains. Except, DNS doesn’t work this way.

How does it work then?
Just as with Opportunistic Encryption, we can point users to the Cloudflare Onion Service using HTTP Alternative Services, a mechanism that allows servers to tell clients that the service they are accessing is available at another network location or over another protocol. For instance, when Tor Browser makes a request to “cloudflare.com,” Cloudflare adds an Alternative Service header to indicate that the site is available to access over HTTP/2 via our onion services. In the same sense that Cloudflare owns the IP addresses that serve our customers’ websites, we run 10 .onion addresses. Think of them as 10 Cloudflare points of presence (or PoPs) within the Tor network. The exact header looks something like this, except with all 10 .onion addresses included, each starting with the prefix “cflare”:

alt-svc: h2="cflare2nge4h4yqr3574crrd7k66lil3torzbisz6uciyuzqc2h2ykyd.onion:443"; ma=86400; persist=1

This simply indicates that “cloudflare.com” can be authoritatively accessed using HTTP/2 (“h2”) via the onion service “cflare2n[...].onion”, over virtual port 443.
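The header mechanics above can be sketched with a toy parser; the function below is illustrative only (not a spec-complete RFC 7838 Alt-Svc parser) and assumes a single, well-formed entry:

```python
def parse_alt_svc(value: str):
    """Parse a single Alt-Svc entry, e.g. 'h2="host.onion:443"; ma=86400; persist=1'."""
    parts = [p.strip() for p in value.split(";")]
    # First part is protocol="authority"
    proto, _, authority = parts[0].partition("=")
    host, _, port = authority.strip('"').rpartition(":")
    # Remaining parts are key=value parameters
    params = dict(p.split("=", 1) for p in parts[1:])
    return {
        "protocol": proto,
        "host": host,
        "port": int(port),
        "max_age": int(params.get("ma", 0)),
        "persist": params.get("persist") == "1",
    }

hdr = 'h2="cflare2nge4h4yqr3574crrd7k66lil3torzbisz6uciyuzqc2h2ykyd.onion:443"; ma=86400; persist=1'
alt = parse_alt_svc(hdr)
print(alt["protocol"], alt["port"], alt["max_age"])
```

A real client would keep one such record per origin and route subsequent requests to the advertised alternative while it remains fresh.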
The field “ma” (max-age) indicates how long in seconds the client should remember the existence of the alternative service and “persist” indicates whether alternative service cache should be cleared when the network is interrupted.

Once the browser receives this header, it attempts to make a new Tor circuit to the onion service advertised in the alt-svc header and confirm that the server listening on virtual port 443 can present a valid certificate for “cloudflare.com” — that is, the original hostname, not the .onion address. The onion service then relays the Client Hello packet to a local server which can serve a certificate for “cloudflare.com.” This way the Tor daemon itself can be very minimal. Here is a sample configuration file:

SocksPort 0
HiddenServiceNonAnonymousMode 1
HiddenServiceSingleHopMode 1
HiddenServiceVersion 3
HiddenServicePort 443
SafeLogging 1
Log notice stdout

Be careful with using the configuration above, as it enables a non-anonymous setting for onion services that do not require anonymity for themselves. To clarify, this does not sacrifice privacy or anonymity of Tor users, just the server. Plus, it improves latency of the circuits.

If the certificate is signed by a trusted certificate authority, for any subsequent requests to “cloudflare.com” the browser will connect using HTTP/2 via the onion service, sidestepping the need for going through an exit node. Here are the steps summarized one more time:

1. A new Tor circuit is established;
2. The browser sends a Client Hello to the onion service with SNI=cloudflare.com;
3. The onion service relays the packet to a local server;
4. The server replies with a Server Hello for SNI=cloudflare.com;
5. The onion service relays the packet to the browser;
6. The browser verifies that the certificate is valid.

To reiterate, the certificate presented by the onion service only needs to be valid for the original hostname, meaning that the onion address need not be mentioned anywhere on the certificate.
This is a huge benefit, because it allows you to, for instance, present a free Let’s Encrypt certificate for your .org domain rather than an expensive EV certificate. Convenience, ✓

Distinguishing the Circuits
Remember that while one exit node can serve many, many different clients, from Cloudflare’s point of view all of that traffic comes from one IP address. This pooling helps cover the malicious traffic among legitimate traffic, but isn’t essential to the security or privacy of Tor. In fact, it can potentially hurt users by exposing their traffic to bad exit nodes. Remember that Tor circuits to onion services carry a circuit number which we can use to rate-limit the circuit. Now, the question is how to inform a server such as nginx of this number with minimal effort. As it turns out, with only a small tweak in the Tor binary, we can insert a Proxy Protocol header at the beginning of each packet that is forwarded to the server. This protocol is designed to help TCP proxies pass on parameters that can be lost in translation, such as source and destination IP addresses, and is already supported by nginx, Apache, Caddy, etc.

Luckily for us, the IPv6 space is so vast that we can encode the Tor circuit number as an IP address in an unused range and use the Proxy Protocol to send it to the server. Here is an example of the header that our Tor daemon would insert in the connection:

PROXY TCP6 2405:8100:8000:6366:1234:ABCD ::1 43981 443\r\n

In this case, 0x1234ABCD encodes the circuit number in the last 32 bits of the source IP address. The local Cloudflare server can then transparently use that IP to assign reputation, show CAPTCHAs, or block requests when needed. Note that even though requests relayed by an onion service don’t carry an IP address, you will see an IP address like the one above with country code “T1” in your logs. This IP only specifies the circuit number seen by the onion service, not the actual user IP address.
In fact, 2405:8100:8000::/48 is an unused subnet allocated to Cloudflare that we are not routing globally for this purpose. This enables customers to continue detecting bots using IP reputation while sparing humans the trouble of clicking on CAPTCHA street signs over and over again. Security, ✓

Why should I trust Cloudflare?
You don’t need to. The Cloudflare Onion Service presents the exact same certificate that we would have used for direct requests to our servers, so you could audit this service using Certificate Transparency (which includes Nimbus, our certificate transparency log) to reveal any potential cheating. Additionally, since Tor Browser 8.0 makes a new circuit for each hostname when connecting via an .onion alternative service, the circuit number cannot be used to link connections to two different sites together. Note that all of this works without running any entry, relay, or exit nodes. Therefore the only requests that we see as a result of this feature are the requests that were headed for us anyway. In particular, since no new traffic is introduced, Cloudflare does not gain any more information about what people do on the internet. Anonymity, ✓

Is it faster?
Tor isn’t known for being fast. One reason for that is the physical cost of having packets bounce around in a decentralized network. Connections made through the Cloudflare Onion Service don’t add to this cost because the number of hops is no more than usual. Another reason is the bandwidth cost borne by exit node operators. This is an area where we hope this service can offer relief, since it shifts traffic from exit nodes to our own servers, reducing exit node operation costs along with it. BONUS: Performance, ✓

How do I enable it?
Onion Routing is now available to all Cloudflare customers, enabled by default for Free and Pro plans.
The option is available in the Crypto tab of the Cloudflare dashboard.

Browser support
We recommend using Tor Browser 8.0, which is the first stable release based on Firefox 60 ESR, and supports .onion Alt-Svc headers as well as HTTP/2. The new Tor Browser for Android (alpha) also supports this feature. You can check whether your connection is routed through an onion service or not in the Developer Tools window under the Network tab. If you’re using the Tor Browser and you don’t see the Alt-Svc in the response headers, that means you’re already using the .onion route. In future versions of Tor Browser you’ll be able to see this in the UI.

“We’ve got BIG NEWS. We gave Tor Browser a UX overhaul. Tor Browser 8.0 has a new user onboarding experience, an updated landing page, additional language support, and new behaviors for bridge fetching, displaying a circuit, and visiting .onion sites.” — The Tor Project (@torproject) September 5, 2018

There is also interest from other privacy-conscious browser vendors. Tom Lowenthal, Product Manager for Privacy & Security at Brave, said: “Automatic upgrades to `.onion` sites will provide another layer of safety to Brave’s Private Browsing with Tor. We’re excited to implement this emerging standard.”

Any last words?
Similar to Opportunistic Encryption, Opportunistic Onions do not fully protect against attackers who can simply remove the alternative service header. Therefore it is important to use HTTPS Everywhere to secure the first request. Once a Tor circuit is established, subsequent requests should stay in the Tor network from source to destination. As we maintain and improve this service we will share what we learn.
In the meanwhile, feel free to try out this idea on Caddy and reach out to us with any comments or suggestions that you might have.

Acknowledgments
Patrick McManus of Mozilla for enabling support for .onion alternative services in Firefox; Arthur Edelstein of the Tor Project for reviewing and enabling HTTP/2 and HTTP Alternative Services in Tor Browser 8.0; Alexander Færøy and George Kadianakis of the Tor Project for adding support for Proxy Protocol in onion services; the entire Tor Project team for their invaluable assistance and discussions; and last, but not least, many folks at Cloudflare who helped with this project.

Addresses used by the Cloudflare Onion Service
cflarexljc3rw355ysrkrzwapozws6nre6xsy3n4yrj7taye3uiby3ad.onion
cflarenuttlfuyn7imozr4atzvfbiw3ezgbdjdldmdx7srterayaozid.onion
cflares35lvdlczhy3r6qbza5jjxbcplzvdveabhf7bsp7y4nzmn67yd.onion
cflareusni3s7vwhq2f7gc4opsik7aa4t2ajedhzr42ez6uajaywh3qd.onion
cflareki4v3lh674hq55k3n7xd4ibkwx3pnw67rr3gkpsonjmxbktxyd.onion
cflarejlah424meosswvaeqzb54rtdetr4xva6mq2bm2hfcx5isaglid.onion
cflaresuje2rb7w2u3w43pn4luxdi6o7oatv6r2zrfb5xvsugj35d2qd.onion
cflareer7qekzp3zeyqvcfktxfrmncse4ilc7trbf6bp6yzdabxuload.onion
cflareub6dtu7nvs3kqmoigcjdwap2azrkx5zohb2yk7gqjkwoyotwqd.onion
cflare2nge4h4yqr3574crrd7k66lil3torzbisz6uciyuzqc2h2ykyd.onion

Social Media Image Sizes for 2018: A Guide for Marketers

Social Media Examiner -

Wondering how to optimize your marketing images for different social media platforms? Looking for a guide to help you make sure your image dimensions are correct? In this article, you’ll discover a guide to the optimal image sizes for nine of the top social media networks. #1: Facebook Image Sizes Facebook is a starting place [...] The post Social Media Image Sizes for 2018: A Guide for Marketers appeared first on Social Media Examiner.

Subscribe to Complete Hosting Guide aggregator