Consulting – Chris Gilligan » new media
https://chrisgilligan.com (portfolio of web work)

University Website Redesign: Scrapbook
https://chrisgilligan.com/drupal/university-website-redesign-scrapbook/
Wed, 19 Jul 2017

Here are some websites and resources that may be useful for a redesign of UTC.edu.

Higher Education sites

College of Southern Nevada

Case Study:
Acquia | College of Southern Nevada | Acquia Engage Award 2016 Finalist

  • 68 percent increase in page views
  • 60 percent decrease in bounce rate
  • 90 percent increase in mobile visitors

CSN.edu

Austin Peay State University

This is a newer design that is built on OU Campus CMS.

  • Very clean design, mobile first.
  • Prominent but not obtrusive Apply/Visit/Give sub-menu on home page.
  • Great organization to introduce Admissions up front: Explore Austin Peay.
  • Student profile cards: diversity of people; could use diversity of disciplines.
  • Visual index of major colleges with department landing pages.
  • Visual calendar layout for upcoming featured events/dates/deadlines.

APSU.edu

UT Knoxville

One of the most innovative and fresh website designs in higher education.

Home page & Vol Stories are built with the ExpressionEngine CMS; interior pages use WordPress, and other portions such as maps are custom-built.

  • Fresh and simple, very fast page load; only 1 large image.
  • Featured Vol Story changes frequently.
  • The menu is the main part of the page. The most frequently used links are grouped with the logo at the very top left.
  • Separate landing page for news.

UTK.edu

South Dakota State University

Mobile-first, recruitment-first. Close to our color palette. Drupal 8.

SDstate.edu

Florida International University

Very close to UTC color palette. Creative use of photography and marketing in the image carousel. Proper use of HTML titles vs. text-on-image. Interesting, simple calendar “teaser”.

FIU.edu

Chicago Booth Review

Drupal 8 CMS.

Excellent news magazine style. Featured/Cover article, popular articles, latest articles, video.

Review.ChicagoBooth.edu

Corporate sites

The Guardian

Ultra-responsive and fast loading news site with excellent use of color.

  • large custom serif type, readable on all devices
  • multiple versions of headlines, subheads and excerpts for home page, indices, and stories – very readable and scannable

The Guardian

First Tennessee Bank

Features a “task-oriented” home page: “I want to… do X” actions and “I’m looking for… Y” interactive elements & animated menus.

There are multiple pathways to common tasks:

  • prominent top-level menus with icons and action words
  • call-to action panels for other common tasks or features
  • common pathways have several routes
    • also enables A/B testing to see which method or path is most effective

FTB.com

National Trust for Historic Preservation

Beautiful site with video backgrounds, clean-yet-interesting design with some slanted lines to add interest.

  • Scalable Vector Graphics (SVG images) for headlines.
  • Extensive and beautiful photography.
  • Simple hover interactivity across the CTAs, blog article cards, etc.

SavingPlaces.org

United States Postal Service

Action-oriented interface: Track Package, Create Label, Buy Stamps, Order Boxes.

  • Interesting use of slanted elements that match the logo geometry.
  • Basic white background lets the corporate colors stand out.
  • “Blades” section in the middle of the page gives a more visual & interactive option for CTAs.

USPS.com

Chevrolet

Chevy.com, like many auto sites, has rich mega-menu content.

  • Clean white palette, which gives photography and brand-color highlights a chance to stand out.
  • Scalable Vector Graphic (SVG) icons and display text are accessible and look great on all devices.

Chevy.com

Sucuri

Long-scroll page, heavy use of icons.

  • Strong use of a very limited color palette.
  • CTAs right in the top menu, then repeated throughout page.

Sucuri.net

Drupal

Simple and clean design with a hero video background.

Drupal.com

Mercury Marine

Good visual menus: icon top menus, image-filled megamenus.

  • Correct use of slider/carousel with delayed titles and CTAs.

Mercury Marine

AirServer

Cute animations & graphics with a consistent style.

Menu: desktop effect on top menu after scroll; pinned/fixed menu on scroll; mobile menu interactive animations.

AirServer

Creative Sites

Macau Design Biennial

Long-scroll with definite color cues between sections.

  • Does not look like a website… looks like print.
  • Very creative interactions invite exploration.

Macau Design Biennial

gskinner

Fast loading site with a long-scroll, landing page feel and multiple calls to action. Nice scrolling behavior for the Calls to Action (CTAs).

gskinner.com

ustwo

Clean and simple logo + teaser video engages the viewer, shows off products, staff, users, services… makes you want to click or scroll down to learn more.

No menu or layout until you click or scroll; then more user interface elements and navigation appear.

ustwo.com

HTML5up: Solid State

Super clean, responsive landing page designs from a Nashville designer; see others at html5up.net.

HTML5up

Palantir

Drupal agency site… strong design, use of colors, CTAs with interesting layout, interactivity/rollover effects, forms, etc. Very clean and mobile-first.

Palantir.net

Drupal 8 Premium Themes

These provide many more user interface elements than the base Bootstrap Drupal 8 Theme. Inexpensive jump-start for a project; typical price is $40-60.


Resources:

YouTube Gallery snippet & table transform for OU Campus
https://chrisgilligan.com/consulting/youtube-gallery-snippet-table-transform-ou-campus/
Wed, 02 Mar 2016

Here’s a neat way to present multiple YouTube videos in a responsive “gallery” in OU Campus. This example uses a mix of Bootstrap CSS and CSS specific to the gallery. Thanks to Wooster Web Design for the HTML/CSS/JS.

With the YouTube Gallery snippet and XSL transform, you can create a video gallery of multiple YouTube videos, selectable by clicking thumbnail images.

Here’s the table transform snippet (add HTML to /_resources/ou/snippets, then create new item in Content > Snippets).

<table class="ou-youtube-gallery transform" style="width: 100%;">
	<caption>YouTube Gallery Table</caption>
	<thead>
		<tr>
			<th width="30%">Title</th>
			<th>YouTube ID (code after "watch?v=" e.g. G4RT-prDbIw in youtube.com/watch?v=G4RT-prDbIw or youtu.be/G4RT-prDbIw)</th>
		</tr>
	</thead>
	<tbody>
		<tr>
			<td>
			My YouTube Video
			</td>
			<td>
			G4RT-prDbIw
			</td>
		</tr>
		<tr>
			<td>
			Another YouTube Video
			</td>
			<td>
			o5ZymO9UgjM
			</td>
		</tr>
	</tbody>
</table>
<br />

Here’s the XSL template match; this belongs in an XSL file such as common.xsl or interior.xsl.

<!-- transform YouTube Gallery -->
<xsl:template match="table[contains(@class, 'ou-youtube-gallery')]">
	<xsl:variable name="id" select="position()" />
  	<style type="text/css">
		@import url(//www.yourdomain.edu/_resources/css/youtube-gallery.css);
  	</style>
	<script type="text/javascript" src="/_resources/js/youtube-gallery.js?x15669" />

	<div class="well">
		<!-- THE YOUTUBE PLAYER -->
		<div class="vid-container">
			<iframe id="vid_frame{$id}" frameborder="0" width="560" height="315">
				<xsl:attribute name="src">https://www.youtube.com/embed/<xsl:value-of select="./tbody/tr[1]/td[2]//text()"/>?rel=0&amp;showinfo=0&amp;autohide=1</xsl:attribute>
			</iframe>
		</div>

		<!-- THE PLAYLIST -->
		<div class="vid-list-container">
			<div class="vid-list">

				<xsl:for-each select="./tbody/tr">
					<div class="vid-item" onclick="document.getElementById('vid_frame{$id}').src='https://youtube.com/embed/{td[2]//text()}?autoplay=1&amp;rel=0&amp;showinfo=0&amp;autohide=1'">

						<div class="thumb"><img alt="video thumbnail" src="https://img.youtube.com/vi/{td[2]//text()}/0.jpg" /></div>

						<div class="desc"><xsl:value-of select="td[1]//text()" /></div>

					</div>
				</xsl:for-each>

			</div><!--/vid-list-->
		</div><!--/vid-container-->

		<!-- LEFT AND RIGHT ARROWS -->
		<div class="arrows">
			<div class="arrow-left"><i class="icon-chevron-left icon-large"><!--scroll left--></i></div>
			<div class="arrow-right"><i class="icon-chevron-right icon-large"><!--scroll right--></i></div>
		</div>

	</div>
</xsl:template>

And here are the JavaScript for scrolling and the CSS for styling:

// www.woosterwebdesign.com/responsive-youtube-player-with-playlist/

$(document).ready(function () {
	$(".arrow-right").bind("click", function (event) {
		event.preventDefault();
		$(".vid-list-container").stop().animate({
			scrollLeft: "+=336"
		}, 750);
	});
	$(".arrow-left").bind("click", function (event) {
		event.preventDefault();
		$(".vid-list-container").stop().animate({
			scrollLeft: "-=336"
		}, 750);
	});
});

/* styles responsive YouTube Gallery with scrolling thumbnails */
/* www.woosterwebdesign.com/responsive-youtube-player-with-playlist/ */

/*  VIDEO PLAYER CONTAINER
############################### */
.vid-container {
	position: relative;
	padding-bottom: 52%;
	padding-top: 30px; 
	height: 0; 
}

.vid-container iframe,
.vid-container object,
.vid-container embed {
	position: absolute;
	top: 0;
	left: 0;
	width: 100%;
	height: 100%;
}


/*  VIDEOS PLAYLIST 
############################### */
.vid-list-container {
	width: 92%;
	overflow: hidden;
	margin-top: 20px;
	margin-left:4%;
	padding-bottom: 20px;
}

.vid-list {
	width: 3680px;
	position: relative;
	top:0;
	left: 0;
}

.vid-item {
	display: block;
	width: 148px;
	height: 148px;
	float: left;
	margin: 0;
	padding: 10px;
}

.thumb {
	/*position: relative;*/
	overflow:hidden;
	height: 84px;
}

.thumb img {
	width: 100%;
	position: relative;
	top: -13px;
}

.vid-item .desc {
	color: #21A1D2;
	font-size: 15px;
	margin-top:5px;
}

.vid-item:hover {
	background: #eee;
	cursor: pointer;
}

.arrows {
	position:relative;
	width: 100%;
}

.arrow-left {
	color: #fff;
	position: absolute;
	background: #777;
	padding: 15px;
	left: -25px;
	top: -130px;
	z-index: 99;
	cursor: pointer;
}

.arrow-right {
	color: #fff;
	position: absolute;
	background: #777;
	padding: 15px;
	right: -25px;
	top: -130px;
	z-index:100;
	cursor: pointer;
}

.arrow-left:hover {
	background: #CC181E;
}

.arrow-right:hover {
	background: #CC181E;
}


@media (max-width: 624px) {
	body {
		margin: 15px;
	}
	.caption {
		margin-top: 40px;
	}
	.vid-list-container {
		padding-bottom: 20px;
	}

	/* reposition left/right arrows */
	.arrows {
		position:relative;
		margin: 0 auto;
		width:96px;
	}
	.arrow-left {
		left: 0;
		top: -17px;
	}

	.arrow-right {
		right: 0;
		top: -17px;
	}
}

…and, finally, some instructions for use:

Click the Insert Snippet button, then choose the category Video/Embedded Media > YouTube Gallery.

This will insert a table in which you enter Title and YouTube ID for each video. This creates a gallery as shown below. To insert additional videos, click the cursor on a table row, then click the Table button, and choose Row > Insert Row Before (or after), then enter the additional title and ID.

 

Integrating Content from External Sources into OU Campus Using RSS, PHP, and JavaScript
https://chrisgilligan.com/consulting/integrating-content-from-external-sources-into-ou-campus-using-rss-php-and-javascript/
Tue, 23 Feb 2016

The web team of the University of Tennessee at Chattanooga uses PHP and RSS to syndicate blog content, news releases, and calendar events into their main website.

PHP SimpleXML is used to parse the XML of the RSS feeds. We import a variety of feeds, from WordPress, from Master Calendar, and from other sites such as an external athletics CMS. WordPress provides some RSS features, including RSS from categories, tags and search strings, but we have added media attachments and a customized template to output a more complicated RSS feed on the University home page.

News releases, events and content from WordPress via RSS

Example web pages:

WordPress plugin & PHP to add media attachments to RSS feed
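The plugin code itself is not reproduced in this post (it ships in the code samples zip mentioned at the end). As a rough sketch of one way to attach featured images to feed items, not necessarily the plugin UTC used, you can hook the same rss2_ns and rss2_item actions that the custom feed template below fires:

<?php
/*
Plugin Name: RSS Media Attachments (sketch)
Description: Minimal sketch -- adds Media RSS elements for each post's featured image.
*/

// Declare the Media RSS namespace on the <rss> element.
add_action( 'rss2_ns', function () {
	echo 'xmlns:media="http://search.yahoo.com/mrss/"' . "\n";
} );

// Echo media:content and media:thumbnail inside each <item>.
add_action( 'rss2_item', function () {
	if ( ! has_post_thumbnail( get_the_ID() ) ) {
		return;
	}
	$thumb_id = get_post_thumbnail_id( get_the_ID() );
	$full     = wp_get_attachment_image_src( $thumb_id, 'full' );
	$small    = wp_get_attachment_image_src( $thumb_id, 'thumbnail' );
	if ( $full ) {
		printf( "\t\t<media:content url=\"%s\" width=\"%d\" height=\"%d\" medium=\"image\" />\n",
			esc_url( $full[0] ), $full[1], $full[2] );
	}
	if ( $small ) {
		printf( "\t\t<media:thumbnail url=\"%s\" width=\"%d\" height=\"%d\" />\n",
			esc_url( $small[0] ), $small[1], $small[2] );
	}
} );

Output along those lines is what the str_ireplace calls in the OU Campus helper file below are normalizing when they rewrite media:content and media:thumbnail URLs to protocol-relative form.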

Example RSS feeds (view the XSL-styled page in Firefox; View Source to see the XML structure, namespaces, and node names):

Custom WordPress RSS feed template, followed by the functions.php code that registers and loads it:

<?php
/*
Template Name: Custom Current Headlines Feed

*/

$numposts = 10;
$category_id = get_cat_ID('Current Headlines');

function custom_rss_date( $timestamp = null ) {
  $timestamp = ($timestamp==null) ? time() : $timestamp;
  echo date(DATE_RSS, $timestamp);
}

function custom_rss_text_limit($string, $length, $replacer = '&hellip;') { 
  $string = strip_tags($string);
  if(strlen($string) > $length) 
    return (preg_match('/^(.*)\W.*$/', substr($string, 0, $length+1), $matches) ? $matches[1] : substr($string, 0, $length)) . $replacer;   
  return $string; 
}

$posts = query_posts('cat='.$category_id.'&showposts='.$numposts);

$lastpost = $numposts - 1;



header('Content-Type: ' . feed_content_type('rss-http') . '; charset=' . get_option('blog_charset'), true);
$more = 1;

echo '<?xml version="1.0" encoding="'.get_option('blog_charset').'"?'.'>'; ?>

<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	<?php do_action('rss2_ns'); ?>
>

<channel>
	<title><?php bloginfo_rss('name'); wp_title_rss(); ?></title>
	<atom:link href="<?php self_link(); ?>" rel="self" type="application/rss+xml" />
	<link><?php bloginfo_rss('url') ?></link>
	<description><?php bloginfo_rss("description") ?></description>
	<lastBuildDate><?php echo mysql2date('D, d M Y H:i:s +0000', get_lastpostmodified('GMT'), false); ?></lastBuildDate>
	<language><?php bloginfo_rss( 'language' ); ?></language>
	<sy:updatePeriod><?php echo apply_filters( 'rss_update_period', 'hourly' ); ?></sy:updatePeriod>
	<sy:updateFrequency><?php echo apply_filters( 'rss_update_frequency', '1' ); ?></sy:updateFrequency>
	<?php do_action('rss2_head'); ?>
	<?php while( have_posts()) : the_post(); ?>
	<item>
		<title><?php the_title_rss() ?></title>
		<link><?php the_permalink_rss() ?></link>
		<comments><?php comments_link_feed(); ?></comments>
		<pubDate><?php echo mysql2date('D, d M Y H:i:s +0000', get_post_time('Y-m-d H:i:s', true), false); ?></pubDate>
		<dc:creator><?php the_author() ?></dc:creator>
<?php the_category_rss('rss2') ?>
		<guid isPermaLink="false"><?php the_guid(); ?></guid>
<?php if (get_option('rss_use_excerpt')) : ?>
		<description><![CDATA[<?php echo get_the_excerpt(); ?>]]></description>
<?php else : ?>
		<description><?php echo '<![CDATA['.custom_rss_text_limit($post->post_content, 256).']]>';  ?></description>
<?php $content = get_the_content_feed('rss2'); ?>
	<?php if ( strlen( $content ) > 0 ) : ?>
		<content:encoded><![CDATA[<?php echo $content; ?>]]></content:encoded>
	<?php else : ?>
		<content:encoded><![CDATA[<?php the_excerpt(); ?>]]></content:encoded>
	<?php endif; ?>
<?php endif; ?>
		<wfw:commentRss><?php echo esc_url( get_post_comments_feed_link(null, 'rss2') ); ?></wfw:commentRss>
		<slash:comments><?php echo get_comments_number(); ?></slash:comments>
<?php rss_enclosure(); ?>
<?php do_action('rss2_item'); ?>
		</item>
	<?php endwhile; ?>
</channel>
</rss>
/* Custom RSS Feed for Home Page headlines -- add this to the theme's functions.php */

function create_my_customfeed() {
	load_template( get_stylesheet_directory() . '/headlines-customfeed.php' );
}
add_action( 'do_feed_headlines', 'create_my_customfeed', 10, 1 );

OU Campus PHP helper file and code asset

To parse an existing external RSS feed, such as a standard or custom WordPress feed, we use a helper file that employs PHP’s simplexml_load_string to load the targeted RSS feed, traverse the XML document, select node content, then echo out a block of styled HTML for each set of targeted XML nodes. The helper file accepts the $feed variable to specify the location of the targeted feed, and the $maxitems variable to specify the number of items to select from the top of that feed.

On its own, the PHP helper file can be hit in a browser to verify that the targeted feed exists and is parsed correctly. The browser will return an unstyled HTML page from the helper file URL.

The OU Campus Code Asset includes the helper file, overrides the helper’s default variables, and is very simple to copy and modify according to the intended target feed and the desired number of items to be presented on the final page. Many different Assets can include a single helper file.

(Vinit from OmniUpdate outlined a much better way to handle the feed URL and number of items in his Server Side Scripting class: via PCF Page Parameters.)

PHP file

  • parses feed and displays HTML
    • variables set to defaults for testing
      • may need to verify feed access and XML structure
  • various types of display html
    • title, excerpt, thumbnail from WordPress
    • title only from WordPress
    • RSS feed with formatting or namespaces different from WordPress
<?php

$input = $_SERVER['QUERY_STRING'];
parse_str($input);

if (!isset($feed))//script or page property will choose which feed to display
	$feed = "http://blog.utc.edu/news/headlines.xml/";

if(!isset($maxitems))
	$maxitems = 3;

$file = file_get_contents($feed);
$file = str_ireplace('src="http://', 'src="//', $file);
$file = str_ireplace('media:content url="http://', 'media:content url="//', $file);
$file = str_ireplace('media:thumbnail url="http://', 'media:thumbnail url="//', $file);

$sxml = simplexml_load_string($file);

$i = 0;
	foreach ($sxml->channel->item as $item) {
		if (++$i > $maxitems) {
				break;
			}
		$namespaces      = $item->getNameSpaces( true );
		$content         = isset($namespaces['content']) ? $item->children( $namespaces['content'] ) : '';
		$content_encoded = isset($content->encoded)      ? $content->encoded                         : '';
		$media           = isset($namespaces['media'])   ? $item->children( $namespaces['media'] )   : '';
		$html       = "<div class=\"row-fluid\">"
					.     "<h3><a href=\"{$item->link}\">{$item->title}</a></h3>"
					.     "{$content_encoded}"
					.  "</div>";
	echo($html);
}
?>
<script>
	$(document).ready(function(){
		$('.sidebar img').removeClass().addClass('thumbnail pull-right span5');
		$('aside.well img').removeClass().addClass('thumbnail pull-right span5');
	});
</script>

OU Campus Assets

  • simple structure
    • specify RSS source
    • number of posts to display
<?php
// set the feed URL and item count before including the helper
// (file_get_contents needs a scheme, so use http:// rather than //)
$feed = "http://blog.utc.edu/hr/category/benefits/feed/";
$maxitems = 5;
include($_SERVER['DOCUMENT_ROOT']. '/_resources/php/get-headlines-sidebar.php');
?>

(Vinit from OmniUpdate outlined a much better way to handle the feed URL and number of items in his Server Side Scripting class: via PCF Page Parameters.)

More Speed

After you verify the final production page, you should set up cron jobs for any external feeds that are generated via database calls on the external servers. A WordPress feed, for example, takes some PHP and MySQL processing on the blog server to generate, and that processing introduces a delay in serving the OU Campus PHP page. To avoid this latency, we create cron jobs on the production server to periodically fetch the feeds and cache them locally on the production server. The speed increase for the final page product is noticeable.

If your OU Campus Sites include development, test, training, or mobile Sites, you will want to duplicate the local static feeds on all of the OU Sites.

If you control an external WordPress site, you can employ caching mechanisms or plugins to more efficiently serve RSS feeds.

  • cache RSS feeds via WordPress plugin, e.g. W3 Total Cache
  • cron job to fetch feeds to local production servers
  • minimize external requests for faster page load

This cron job can be placed in /etc/cron.hourly to fetch the WordPress feeds, compare the new lastBuildDate to the existing cached build date, and write a cached RSS XML file when the feed has changed.

#!/bin/bash

# Preload WordPress Feeds for Website

# UTC News
wget http://blog.utc.edu/news/headlines.xml/ -O /data/web/prod/www/_resources/rss/wp-news.tmp >/dev/null 2>&1
TMP_LASTBUILD="$(xml_grep '/rss/channel/lastBuildDate' --text_only /data/web/prod/www/_resources/rss/wp-news.tmp)"
XML_LASTBUILD="$(xml_grep '/rss/channel/lastBuildDate' --text_only /data/web/prod/www/_resources/rss/wp-news.xml)"
MIMETYPE=`file -b --mime-type /data/web/prod/www/_resources/rss/wp-news.tmp`
if [ "$MIMETYPE" == "application/xml" -a  "$TMP_LASTBUILD" != "$XML_LASTBUILD" ] ; then
    mv /data/web/prod/www/_resources/rss/wp-news.tmp /data/web/prod/www/_resources/rss/wp-news.xml
fi

# Duplicate for development, test & training environments
cp -p /data/web/prod/www/_resources/rss/wp*.xml /data/web/test/www/_resources/rss/
cp -p /data/web/prod/www/_resources/rss/wp*.xml /data/web/test/train/_resources/rss/
cp -p /data/web/prod/www/_resources/rss/wp*.xml /data/web/dev/www/_resources/rss/

For other RSS sources, the structure may be different; there may be no lastBuildDate to check. And… some servers may not respond to a default wget request from a shell script. No problem: just specify --header and --user-agent, skip the lastBuildDate check, and write the RSS XML file directly.

#!/bin/bash

# Preload GoMocs Feeds for Website


# Gomocs.com News
wget  --header="Accept: text/html" --user-agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0" http://www.gomocs.com/rss.aspx -O /data/web/prod/www/_resources/gomocs-news.xml

# Duplicate for development, test & training environments
cp -p /data/web/prod/www/_resources/rss/gomocs*.xml /data/web/test/www/_resources/rss/
cp -p /data/web/prod/www/_resources/rss/gomocs*.xml /data/web/test/train/_resources/rss/
cp -p /data/web/prod/www/_resources/rss/gomocs*.xml /data/web/dev/www/_resources/rss/
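Once a cron job has written the cached copy, the OU Campus code asset can point $feed at the local file instead of the remote feed; file_get_contents() reads local paths as well as URLs. A sketch along the lines of the asset shown earlier (the cached file name matches the cron script above):

<?php
// Code asset variant: read the locally cached XML written by the cron job
// instead of hitting the remote blog or athletics server on every page render.
$feed = $_SERVER['DOCUMENT_ROOT'] . '/_resources/rss/wp-news.xml';
$maxitems = 3;
include($_SERVER['DOCUMENT_ROOT'] . '/_resources/php/get-headlines-sidebar.php');
?>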

Event Calendar RSS feeds from Master Calendar via RSS

Example pages that include feeds from Master Calendar. MC has a very rudimentary RSS output of title, date/time, and description. MC caches its RSS feeds by default and expects a lot of traffic to the feed locations, so it is usually not necessary to create cron jobs to fetch them. However, if you have an OU Campus page that fetches and displays a large number of MC feeds, you will see a performance increase by using cron jobs. UTC.edu’s home page fetches a number of feeds, and we saw a noticeably faster page load after moving these calls to cron jobs.

Social Media Streams

UTC uses a jQuery plugin and PHP to pull in social media posts for the university account, as well as for individual departments and colleges. This is an easy way to create a social stream sidebar or social wall page.

Facebook, Twitter, and Instagram require a PHP API script to connect as a signed app. The jquery.imagesloaded plugin is helpful for the wall display.
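The API scripts themselves aren’t shown here (they depend on each network’s app credentials). As a hypothetical sketch of just the rendering side, assuming those scripts have already cached recent posts as JSON files on the server, the wall markup can be assembled in PHP much like the RSS helpers above:

<?php
// Hypothetical sketch: merge cached social posts and emit cards for a social wall.
// Assumes each network's API script has already written /_resources/social/{network}.json
// containing an array of items like
// {"network": "...", "permalink": "...", "text": "...", "image": "...", "timestamp": 1456185600}.
$networks = array('facebook', 'twitter', 'instagram');
$posts = array();

foreach ($networks as $network) {
	$file = $_SERVER['DOCUMENT_ROOT'] . "/_resources/social/{$network}.json";
	if (!is_readable($file)) {
		continue;
	}
	$items = json_decode(file_get_contents($file), true);
	if (is_array($items)) {
		$posts = array_merge($posts, $items);
	}
}

// Newest first across all networks, then cap the wall at a dozen cards.
usort($posts, function ($a, $b) {
	return $b['timestamp'] - $a['timestamp'];
});

foreach (array_slice($posts, 0, 12) as $post) {
	printf(
		"<div class=\"social-card %s\"><a href=\"%s\"><img src=\"%s\" alt=\"\" /><p>%s</p></a></div>\n",
		htmlspecialchars($post['network']),
		htmlspecialchars($post['permalink']),
		htmlspecialchars($post['image']),
		htmlspecialchars($post['text'])
	);
}
?>

The jquery.imagesloaded callback can then trigger the wall layout once the card images have finished loading.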

Code Samples

(A compilation zip including all code mentioned in the presentation is available. Comment and subscribe to this post to be notified of updates.)

  • WordPress plugin to add media attachments to RSS feed
  • WordPress functions.php changes
  • PHP files to parse feeds and display HTML
  • OU Campus Assets to specify RSS source and set number of posts
  • cron job to fetch RSS feeds to local server

TEDx Event Website
https://chrisgilligan.com/portfolio/tedx-event-website/
Wed, 04 Nov 2015

I’m very excited to help out with Chattanooga’s premier TEDx event: TEDxChattanooga.

TEDxChattanooga website screenshot

TEDxChattanooga website

For this site, I chose a responsive Bootstrap 3 SASS WordPress theme, originally developed for TEDxToronto. I made a few tweaks and improvements to better fit our event, and worked with April Cox from UT Chattanooga to dial in the design and architecture. Developed on an Amazon EC2+Ubuntu+Webmin server running a Nginx+MySQL+PHP-FPM stack, the site should handle plenty of traffic, and can be scaled up to meet spikes in demand coinciding with the event.

Please check out TEDxChattanooga.com!

Love Open Source software, but hate the generic branding? No problem. It’s very simple to create a fully-branded login screen for WordPress and Webmin/Virtualmin, to match a client’s logo and color scheme.
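Webmin/Virtualmin branding is typically handled through its own theme settings; on the WordPress side, a minimal sketch might look like this in a theme’s functions.php or a small plugin (the hooks are core WordPress, while the logo path and colors are placeholders):

<?php
// Minimal sketch: brand the WordPress login screen with a client logo and colors.
// login_enqueue_scripts, login_headerurl and login_headertext are core hooks;
// the image path and hex colors below are placeholders.
add_action( 'login_enqueue_scripts', function () { ?>
	<style type="text/css">
		#login h1 a {
			background-image: url(<?php echo esc_url( get_stylesheet_directory_uri() . '/images/client-logo.png' ); ?>);
			background-size: contain;
			width: 300px;
		}
		body.login { background-color: #1a1a1a; }
		.wp-core-ui .button-primary { background: #e62b1e; border-color: #e62b1e; }
	</style>
<?php } );

// Send the logo link to the site instead of wordpress.org.
add_filter( 'login_headerurl', function () {
	return home_url();
} );

// Use the site name as the logo's title text.
add_filter( 'login_headertext', function () {
	return get_bloginfo( 'name' );
} );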


 

Screenshots: branded login screens for WordPress and Webmin/Virtualmin.

WordPress Fail2Ban RegEx for RedHat, CentOS, Amazon Linux
https://chrisgilligan.com/consulting/wordpress-wp-fail2ban-regex-redhat-centos-amazon-linux/
Thu, 30 May 2013

VacantServer WordPress sites are getting hammered with bad logins and probes.

We’ve implemented a plugin to log failed login attempts to syslog, and a Fail2Ban filter for the same. If you run these on RedHat, you’ll need some additional configuration info… here it is:

WordPress login failure regex (error_log):
^%(__prefix_line)sAuthentication failure for .* from <HOST>$

Apache nohome regex (error_log):

[[]client <HOST>[]] File does not exist: .*/~.*

PHP noscript regex (/home/*/logs/error_log,/var/log/httpd/error_log):

[[]client <HOST>[]] (File does not exist|script not found or unable to stat): /\S*(\.php|\.asp|\.exe|\.pl)
[[]client <HOST>[]] script '/\S*(\.php|\.asp|\.exe|\.pl)\S*' not found or unable to stat *$

XMLRPC flood attacks — DDoS and probing (/home/*/logs/access_log):

<HOST>\s.*\s.POST\s/xmlrpc.php*.\s.*

Please also enable the generic apache-nohome and apache-noscript filters, and install the WP fail2ban plugin (configured for your server) on your high-traffic blogs. All of these are helping during the current onslaught, which also includes probing for wp-admin directories, probing for /wp-admin/login.php, plus comment spam.

A new XMLRPC exploit has the script kiddies doing DDoS and probing for vulnerable services, and possibly executing code remotely on them.

Here are some additional resources:

Fail2Ban Regex for RedHat, Fedora, CentOS and Amazon Linux 2013
https://chrisgilligan.com/consulting/amazon-web-services/fail2ban-regex/
Sat, 20 Apr 2013

Fail2Ban is a very efficient daemon that scans log files for malicious activity, and offers several options to ban offending IPs and hostnames. Although it is highly configurable, it requires a depth of knowledge beyond that required for GUI-accessible firewalls such as ConfigServer Security & Firewall.

Harden the Kernel

Before doing anything further, it is necessary to harden the server at the kernel level. Doing so will prevent the majority of attacks that the Fail2Ban config would otherwise have to deal with. Do this: Linux kernel sysctl hardening. Prior to learning those trix, I had relied solely upon ConfigServer Security and Firewall (CSF) to block malicious attacks. Heading those off at the kernel level is much more efficient.

I’m a huge fan of CSF, having used it with cPanel, Webmin and Virtualmin for many years. Moving to Amazon Web Services presented a challenge, however. I’ve yet to figure out how to use CSF on AWS, so I’ve backtracked to my old standby, Fail2Ban.

“Out of the box,” Fail2Ban for CentOS & other RedHat downstream distros only provides protection against SASL login failures. But with a little work and research, one can successfully configure many more “jails” for the malicious traffic. To take full advantage of Fail2Ban, the admin will need to become familiar with RegEx: regular expressions.

Amazon Web Services Security Groups

AWS provides several levels of firewall protection for EC2 instances; I mainly utilize EC2 security groups and VPC security groups. These firewalls serve to allow traffic to an instance based on IP address, IP block, port, etc. VPC will gateway traffic to and within a virtual cloud; one can create a security group for a cohort of replicated database servers to only allow traffic from within the virtual cloud. Likewise, one could create a cloud group with public facing, load-balanced Varnish reverse proxies that interact with an Apache or NginX app server, and that app server connects to the database cohort. The only publicly-accessible servers would be the web proxies. This is similar to what one might expect in a traditional network operations center.

Learn some RegEx

Learning a bit about RegEx will benefit the admin in virtually every programming endeavor: PHP, Perl, C, SQL etc. all make use of RegEx to match strings of text. Matching a variable string of text in a log file, for example, allows Fail2Ban to act upon that string, sending an email, looking up a hostname, and/or invoking a firewall rule to block traffic from the offending origin for a designated amount of time.

As stated earlier, plain vanilla Fail2Ban regex from EPEL is functional only to block SASL authentication errors. Most of the filters in the package are, at best, examples of what can be done, and will not function as written. Why? Fail2Ban has to deal with each GNU/Linux distribution’s particular log file implementation and the peculiarities of the reporting structure in those logs, so no single RegEx or log file location is going to work for every distribution. Neither can Fail2Ban account for every change in log file location or structure from version to version. What worked for CentOS 5.x may not work for CentOS 6.x; what worked in kernel 2.8 may be invalid in kernel 3.4.

Also stated earlier, it is well worth the time to learn a little about RegEx. One vehicle to propel the admin far along the path is RegExr. Author Grant Skinner provides an online version as well as an Adobe AIR standalone app. RegExr is an invaluable tool for writing and proofing regular expressions.


Write the RegEx

OK, here’s the payoff for reading so far: a regex filter that will squelch Dovecot brute-force attacks on CentOS 6, Amazon Linux 2013 and other kernel 3.4 RedHat downstream distros such as Fedora and Arch. This would, of course, go in /etc/fail2ban/filter.d/dovecot.conf:

Here’s the original, commented out via #:

#failregex = .*(?:pop3-login|dovecot):.*(?:Authentication failure|Aborted login \(auth failed|Aborted login \(tried to use disabled|Disconnected \(auth failed).*rip=(?P<host>\S*),.*

…and here’s a substitute, created via experimentation with RegExr:

failregex = .*auth.*pam.*dovecot.*(?:authentication failure).*rhost=<HOST>.*

How do you create a valid RegEx to match the failures to ban? For that, one must monitor log files. In the case of Dovecot on Amazon Linux 2013, that would be /var/log/secure — same as it has ever been with RedHat downstreams. However, the logging is different from what it may have been in CentOS 5.8.

Have a look at the log files: use grep to filter the secure log on “authentication failure”

tail -f -n 20000 /var/log/secure | grep "authentication failure"

There will likely be log entries similar to the following:

Apr 19 05:22:19 vm5 auth: pam_unix(dovecot:auth): authentication failure; logname= uid=0 euid=0 tty=dovecot ruser=oracle rhost=80.255.3.104 
Apr 19 05:22:20 vm5 auth: pam_unix(dovecot:auth): authentication failure; logname= uid=0 euid=0 tty=dovecot ruser=sybase rhost=80.255.3.104 
Apr 19 05:22:20 vm5 auth: pam_unix(dovecot:auth): authentication failure; logname= uid=0 euid=0 tty=dovecot ruser=informix rhost=80.255.3.104

The RegEx shown previously will match those entries, and Fail2Ban will then ban the offending IPs.
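As a quick sanity check outside of the fail2ban-regex tool, the same pattern can be exercised in PHP against one of the log lines above; the named capture group stands in for Fail2Ban’s <HOST> tag (that substitution is only for this test, since Fail2Ban expands <HOST> itself):

<?php
// Test the Dovecot failregex against a sample line from /var/log/secure.
// Fail2Ban expands <HOST> internally; a named capture group stands in for it here.
$pattern = '/.*auth.*pam.*dovecot.*(?:authentication failure).*rhost=(?P<host>\S+).*/';

$line = 'Apr 19 05:22:19 vm5 auth: pam_unix(dovecot:auth): authentication failure; '
      . 'logname= uid=0 euid=0 tty=dovecot ruser=oracle rhost=80.255.3.104';

if (preg_match($pattern, $line, $m)) {
    echo "would ban: {$m['host']}\n";   // prints: would ban: 80.255.3.104
} else {
    echo "no match\n";
}
?>

Of course, fail2ban-regex /var/log/secure /etc/fail2ban/filter.d/dovecot.conf remains the authoritative test.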

So, you think you’re ready with RegEx fu to tackle the Local Jail (jail.local)? Not so fast. Don’t use regex in jail.local. This file takes wildcards, not regex… so if you want to monitor every user’s /home/user/logs/access_log, you must use the wildcard /home/*/logs/access_log, NOT the regex /home/.*/logs!

… more to come: follow this post for more details!

Install phpMyAdmin with SSL on CentOS, Amazon Linux, RedHat (Apache or NginX)
https://chrisgilligan.com/consulting/install-phpmyadmin-with-ssl-on-centos-amazon-linux-redhat/
Mon, 24 Sep 2012

phpMyAdmin database display

I recently ran into a problem with the upgraded 3.5.2 phpMyAdmin package provided via the rpmforge.repo. Search no longer works, nor does pagination, etc. Plus, it’s out of date and vulnerable to an XSS exploit.

I have solved this by changing to the EPEL repo, which maintains the latest version of phpMyAdmin.

This post will teach you how to install phpMyAdmin on CentOS, Amazon Linux, or Redhat. Configuration instructions are provided for Apache and NginX web servers.

For this to work properly and safely, you should be running SSL on your host. Otherwise, change the ForceSSL line in the config file provided below…

Install phpMyAdmin from EPEL repository

Uninstall current PMA:
yum erase phpMyAdmin

Set up EPEL repo:

Find the latest epel-release at http://download.fedoraproject.org/pub/epel/6/x86_64/
rpm -Uvh http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm

On Amazon Linux, epel-release is already loaded. Edit /etc/yum.repos.d/epel.repo and set enabled=1.

Edit /etc/yum.repos.d/epel.repo to only include necessary software packages:
includepkgs=phpMyAdmin php-php-gettext

I have removed the RPMforge repo due to some recent problems, but if you still need the RPMforge repo, exclude these packages there:
exclude=phpMyAdmin php-php-gettext

Install PMA:
yum install phpMyAdmin

Apache

Edit /etc/httpd/conf.d/phpMyAdmin.conf
  • Allow all incoming hosts for a web hosting server, or only allow the hosts you require (localhost, your local workstation, etc.).
  • For a hosting server with open access, you should have a login failure daemon to block offending IP addresses with multiple failed HTTP logins. ConfigServer Firewall works well, and has modules for cPanel and Webmin. OSSEC is another reactive IPTables firewall with Apache login failure IP blacklisting.
  • For Amazon EC2, a good practice is to create a discrete database server, and only allow access to it inside a VPC security group, or from specific IP addresses enabled in that security group. That way, you can safely Allow from ALL hosts, because the VPC firewall will prevent other access. Install phpMyAdmin on a separate web server, and restrict access to PMA’s directory (see below).
# phpMyAdmin - Web based MySQL browser written in php
# 
# Allows only localhost by default
#
# But allowing phpMyAdmin to anyone other than localhost should be considered
# dangerous unless properly secured by SSL

Alias /phpMyAdmin /usr/share/phpMyAdmin
Alias /phpmyadmin /usr/share/phpMyAdmin

<Directory /usr/share/phpMyAdmin/>
   <IfModule mod_authz_core.c>
     # Apache 2.4
     <RequireAny>
       Require ip 127.0.0.1
       Require ip ::1
     </RequireAny>
   </IfModule>
   <IfModule !mod_authz_core.c>
     # Apache 2.2
     Order Deny,Allow
     # comment out Allow from All and add your own static IPs here for security 
     # Allow from All
     Allow from 123.456.7.89
     Allow from 12.345.67.89
     Allow from 127.0.0.1
     Allow from ::1
   </IfModule>
</Directory>

<Directory /usr/share/phpMyAdmin/setup/>
   <IfModule mod_authz_core.c>
     # Apache 2.4
     <RequireAny>
       Require ip 127.0.0.1
       Require ip ::1
     </RequireAny>
   </IfModule>
   <IfModule !mod_authz_core.c>
     # Apache 2.2
     Order Deny,Allow
     Deny from All
     Allow from 127.0.0.1
     Allow from ::1
   </IfModule>
</Directory>

# These directories do not require access over HTTP - taken from the original
# phpMyAdmin upstream tarball
#
<Directory /usr/share/phpMyAdmin/libraries/>
    Order Deny,Allow
    Deny from All
    Allow from None
</Directory>

<Directory /usr/share/phpMyAdmin/setup/lib/>
    Order Deny,Allow
    Deny from All
    Allow from None
</Directory>

<Directory /usr/share/phpMyAdmin/setup/frames/>
    Order Deny,Allow
    Deny from All
    Allow from None
</Directory>

# This configuration prevents mod_security at phpMyAdmin directories from
# filtering SQL etc.  This may break your mod_security implementation.
#
#<IfModule mod_security.c>
#    <Directory /usr/share/phpMyAdmin/>
#        SecRuleInheritance Off
#    </Directory>
#</IfModule>

NginX

Edit nginx.conf, adding these location blocks inside the server block for the website’s server_name:

       location /phpMyAdmin {
               root /usr/share/;
               index index.php;
               location ~ ^/phpMyAdmin/(.+\.php)$ {
                       try_files $uri =404;
                       root /usr/share/;
                       fastcgi_pass localhost:9002;
                       fastcgi_param HTTPS on;
                       fastcgi_index index.php;
                       fastcgi_param SCRIPT_FILENAME /usr/share$fastcgi_script_name;
                       include /etc/nginx/fastcgi_params;
                       fastcgi_buffer_size 128k;
                       fastcgi_buffers 256 4k;
                       fastcgi_busy_buffers_size 256k;
                       fastcgi_temp_file_write_size 256k;
                       fastcgi_intercept_errors on;
               }
               location ~* ^/phpMyAdmin/(.+\.(jpg|jpeg|gif|css|png|js|ico|html|xml|txt))$ {
                       root /usr/share/;
               }
        }

        location /phpmyadmin {
               rewrite ^/* /phpMyAdmin last;
        }

Edit /etc/phpMyAdmin/config.inc.php

The config below shows some common config options. Important ones are ForceSSL and auth_type. For a production server, SSL should be ON and auth_type http is better; http auth uses MySQL user/pass combinations to restrict access to user-specific databases.

<?php
/* Servers configuration */
$i = 0;

/* Server: MySQL Server [1] */
$i++;
$cfg['Servers'][$i]['verbose'] = 'MySQL Server';
$cfg['Servers'][$i]['host'] = '122.34.567.89';
$cfg['Servers'][$i]['port'] = '3306';
$cfg['Servers'][$i]['socket'] = '';
$cfg['Servers'][$i]['connect_type'] = 'tcp';
$cfg['Servers'][$i]['extension'] = 'mysqli';
$cfg['Servers'][$i]['auth_type'] = 'http';
$cfg['Servers'][$i]['user'] = 'pma';
$cfg['Servers'][$i]['password'] = '';
$cfg['Servers'][$i]['pmadb'] = 'phpmyadmin';
$cfg['Servers'][$i]['controluser'] = 'pma';
$cfg['Servers'][$i]['controlpass'] = 'pmapassword';
$cfg['Servers'][$i]['bookmarktable'] = 'pma_bookmark';
$cfg['Servers'][$i]['relation'] = 'pma_relation';
$cfg['Servers'][$i]['userconfig'] = 'pma_userconfig';
$cfg['Servers'][$i]['table_info'] = 'pma_table_info';
$cfg['Servers'][$i]['column_info'] = 'pma_column_info';
$cfg['Servers'][$i]['history'] = 'pma_history';
$cfg['Servers'][$i]['recent'] = 'pma_recent';
$cfg['Servers'][$i]['table_uiprefs'] = 'pma_table_uiprefs';
$cfg['Servers'][$i]['tracking'] = 'pma_tracking';
$cfg['Servers'][$i]['table_coords'] = 'pma_table_coords';
$cfg['Servers'][$i]['pdf_pages'] = 'pma_pdf_pages';
$cfg['Servers'][$i]['designer_coords'] = 'pma_designer_coords';

/* End of servers configuration */

$cfg['UploadDir'] = '/tmp';
$cfg['SaveDir'] = '/tmp';
/* only if your host supports SSL */
$cfg['ForceSSL'] = true;
$cfg['DefaultLang'] = 'en';
$cfg['ServerDefault'] = 1;
?>

Create the phpmyadmin database for advanced functionality

Look in phpMyAdmin folder /usr/share/phpMyAdmin/examples for create_tables.sql

ssh to server as root user
mysql
-- or if it asks for password --
mysql -u your-mysql-superuser -pyour-superuser-password
mysql > source /usr/share/phpMyAdmin/examples/create_tables.sql

Log in to PMA

Now you can log into PMA with your mysql root user credentials. https://yourhost.tld/phpmyadmin

  • Create a mysql user, pma, with the password you added to the config file, with no default permissions, on localhost
  • Give pma user all permissions on phpmyadmin database, on localhost

Now you have a secure PMA which will work for all mysql users on your host. Version 3.5+ now has Status Monitoring and Advisor. Used in conjunction with a performance tuning script like MySQL Tuning Primer, it will help you fine-tune your MySQL server to your requirements and your environment.

phpMyAdmin MySQL server dashboard

SSL and CloudFront CDN Support for WebFonts via .htaccess
https://chrisgilligan.com/wordpress/add-cloudfront-cdn-support-for-webfonts-via-htaccess/
Wed, 05 Sep 2012

I recently upgraded my WordPress theme to WooThemes Canvas 5.x, and I found that some of the icons were not rendering, but were showing a letter or integer instead. I dug into the code and found that these icons are now delivered via @font-face webfonts.

Meanwhile, I’m working on a client’s e-commerce site with Google WebFonts and a custom webfont to display the Rupee symbol (Indian currency).

Though the fonts were uploading properly to the CloudFront CDN, and were properly referenced in the minified CSS on the CDN, they were not rendering in Firefox or IE, and the SSL pages on the client’s site were throwing security warnings.

At first I thought this might be a W3 Total Cache issue, because upgrading to the latest development release had solved some other CSS issues. However, it turned out to be a browser security issue.

Evidently this blocking is a browser security measure to prevent cross-site attacks, but you can permit it via Apache mod_headers, allowing your specific CloudFront (or other) domain as specified in the .htaccess file.

Some sites have suggested using “*” wildcards to allow all domains… but obviously this is a security issue: with this header, you are granting JavaScript clients basic access to your resources. I recommend you only allow access to the specific CDN domains you require. Do this with a comma-separated list, in double quotes.

# BEGIN CDN Cross-Site for Webfonts
<IfModule mod_mime.c>
        AddType font/ttf .ttf
        AddType font/eot .eot
        AddType font/opentype .otf
        AddType font/x-woff .woff
</IfModule>
<FilesMatch "\.(svg|ttf|otf|eot|woff)$">
    <IfModule mod_headers.c>
        Header set Access-Control-Allow-Origin "fonts.googleapis.com,{{yourdistro69}}.cloudfront.net"
    </IfModule>
</FilesMatch>
# END CDN Cross-Site for Webfonts

Yay! Delicious webfonts, even on SSL pages via CDN!

Varnish VCL and Config for WordPress with W3 Total Cache
https://chrisgilligan.com/consulting/varnish-vcl-and-config-for-wordpress-with-w3-total-cache/
Wed, 15 Aug 2012

I have been working on a Varnish front-end for Apache, to be used with WordPress sites. I described the architecture in Load Balancing Virtualmin WordPress Hosting Server with Varnish on AWS. I now have a configuration that seems to work for all WordPress features, including logged-out commenting. This configuration also works well with W3 Total Cache.

This configuration is for Varnish on a separate server, but should also work on a single server with appropriate changes to the port and backend IP settings.

Varnish Config (/etc/sysconfig/varnish)

# Configuration file for varnish
#
# /etc/init.d/varnish expects the variable $DAEMON_OPTS to be set from this
# shell script fragment.
#
#
# Maximum number of open files (for ulimit -n)
NFILES=131072
#
# Locked shared memory (for ulimit -l)
# Default log size is 82MB + header
MEMLOCK=82000
#
# Maximum size of corefile (for ulimit -c). Default in Fedora is 0
# DAEMON_COREFILE_LIMIT="unlimited"
#
# Set this to 1 to make init script reload try to switch vcl without restart.
# To make this work, you need to set the following variables
# explicit: VARNISH_VCL_CONF, VARNISH_ADMIN_LISTEN_ADDRESS,
# VARNISH_ADMIN_LISTEN_PORT, VARNISH_SECRET_FILE
RELOAD_VCL=1
#
## Advanced configuration
#
# # Main configuration file.
VARNISH_VCL_CONF=/etc/varnish/wordpress-varnish3.vcl
#
# # Default address and port to bind to
# # Blank address means all IPv4 and IPv6 interfaces, otherwise specify
# # a host name, an IPv4 dotted quad, or an IPv6 address in brackets.
# VARNISH_LISTEN_ADDRESS=
VARNISH_LISTEN_PORT=80
#
# # Telnet admin interface listen address and port
VARNISH_ADMIN_LISTEN_ADDRESS=127.0.0.1
VARNISH_ADMIN_LISTEN_PORT=6082
#
# # Shared secret file for admin interface
VARNISH_SECRET_FILE=/etc/varnish/secret
#
# # The minimum number of worker threads to start
VARNISH_MIN_THREADS=1
#
# # The Maximum number of worker threads to start
VARNISH_MAX_THREADS=1000
#
# # Idle timeout for worker threads
VARNISH_THREAD_TIMEOUT=120
#
# # Cache file location if using file cache
#VARNISH_STORAGE_FILE=/var/lib/varnish/varnish_storage.bin
#
# # Cache size: in bytes, optionally using k / M / G / T suffix,
# # or in percentage of available disk space using the % suffix.
VARNISH_STORAGE_SIZE=3G
#
# # Backend storage specification
# malloc runs from RAM, file from file
VARNISH_STORAGE="malloc,${VARNISH_STORAGE_SIZE}"
#VARNISH_STORAGE="file,${VARNISH_STORAGE_FILE},${VARNISH_STORAGE_SIZE}"
#
# # Default TTL used when the backend does not specify one
VARNISH_TTL=120
#
# # DAEMON_OPTS is used by the init script. If you add or remove options,
# # be sure you update this section, too.
DAEMON_OPTS="-a ${VARNISH_LISTEN_ADDRESS}:${VARNISH_LISTEN_PORT} \
-f ${VARNISH_VCL_CONF} \
-T ${VARNISH_ADMIN_LISTEN_ADDRESS}:${VARNISH_ADMIN_LISTEN_PORT} \
-t ${VARNISH_TTL} \
-w ${VARNISH_MIN_THREADS},${VARNISH_MAX_THREADS},${VARNISH_THREAD_TIMEOUT} \
-u varnish -g varnish \
-S ${VARNISH_SECRET_FILE} \
-s ${VARNISH_STORAGE}"
#

Varnish VCL (/etc/varnish/wordpress-varnish3.vcl)

backend origin {
.host = "10.11.12.13";
.port = "80";
.connect_timeout = 60s;
.first_byte_timeout = 60s;
.between_bytes_timeout = 60s;
}
#
sub vcl_recv {
# only using one backend
set req.backend = origin;
#
# set standard proxied ip header for getting original remote address
set req.http.X-Forwarded-For = client.ip;
#
# logged in users must always pass
if( req.url ~ "^/wp-(login|admin)" || req.http.Cookie ~ "wordpress_logged_in_" ){
return (pass);
}
# accept purges from w3tc and varnish http purge
if (req.request == "PURGE") {
return (lookup);
}
#
# don't cache search results
if( req.url ~ "\?s=" ){
return (pass);
}
#
# always pass through posted requests and those with basic auth
if ( req.request == "POST" || req.http.Authorization ) {
return (pass);
}
#
# else ok to fetch a cached page
unset req.http.Cookie;
return (lookup);
}
#
# accept purges from w3tc and varnish http purge
sub vcl_hit {
if (req.request == "PURGE") { purge; }
return (deliver);
}
#
# accept purges from w3tc and varnish http purge
sub vcl_miss {
if (req.request == "PURGE") { purge; }
return (fetch);
}
#
sub vcl_fetch {
#
# remove some headers we never want to see
unset beresp.http.Server;
unset beresp.http.X-Powered-By;
#
# only allow cookies to be set if we're in admin area - i.e. commenters stay logged out
if( beresp.http.Set-Cookie && req.url !~ "^/wp-(login|admin)" ){
unset beresp.http.Set-Cookie;
}
#
# don't cache response to posted requests or those with basic auth
if ( req.request == "POST" || req.http.Authorization ) {
return (hit_for_pass);
}
#
# only cache status ok
if ( beresp.status != 200 ) {
return (hit_for_pass);
}
#
# don't cache search results
if( req.url ~ "\?s=" ){
return (hit_for_pass);
}
#
# else ok to cache the response
set beresp.ttl = 24h;
return (deliver);
}
#
sub vcl_deliver {
# add debugging headers, so we can see what's cached
if (obj.hits > 0) {
set resp.http.X-Cache = "HIT";
}
else {
set resp.http.X-Cache = "MISS";
}
# remove some headers added by varnish
unset resp.http.Via;
unset resp.http.X-Varnish;
}
#
sub vcl_hash {
hash_data( req.url );
# altering hash so subdomains are ignored.
# don't do this if you actually run different sites on different subdomains
if ( req.http.host ) {
hash_data( regsub( req.http.host, "^([^\.]+\.)+([a-z]+)$", "\1\2" ) );
} else {
hash_data( server.ip );
}
# ensure separate cache for mobile clients (WPTouch workaround)
if( req.http.User-Agent ~ "(iPod|iPhone|incognito|webmate|dream|CUPCAKE|WebOS|blackberry9\d\d\d)" ){
hash_data("touch");
}
return (hash);
}

Punk Rock Music Community
https://chrisgilligan.com/portfolio/punk-rock-music-community/
Tue, 10 Apr 2012

Punktastic.com is a community powered site that covers punk music in Britain, but is poised to go worldwide. They cover punk and hardcore shows and festivals, and provide album reviews, video interviews and more.

With a growing audience and user base, Punktastic needed a more reliable and robust web server, so they made the move to a dedicated CentOS 5 series web server with 3GB RAM. While this is a relatively low-end box, it has plenty of horsepower for a single WordPress site and phpBB3 forum.

Punktastic.com web site

For this GIG, I soloed on…

  • transferring the site from a WAMP development server to the live LAMP server
  • configuring an active firewall to block the baddies
  • installing monitoring and administration tools
  • tuning Apache and MySQL for high traffic
  • integrating APC PHP Cache to speed up web pages and provide better concurrency

Loud Fast Rules.
