Vacation Intelligence! Solving Holiday Challenges with Power BI: Part 1

Not My Usual Blog Post

Despite a career in Business Intelligence approaching twenty years, I have never used data in any meaningful way in my own life until now, which is why I feel the need to blog about it!

I was planning a trip to Tokyo and wanted to identify the best locations in which to book hotels based on local attractions, with the intention of staying at several places over a fortnight.

However, between Tokyo’s size and the sheer volume of entertainment on offer, it became obvious that some sort of map was needed, and the generic tourist maps available online are not personalised to my interests.

Once the map is developed and the hotels booked, I want to add information about places and activities of interest and fine-tune the holiday. It is too far to fly to miss anything!

Power BI is the reporting tool chosen for this task due to its easy mapping capabilities and because I happen to be doing some work with it currently.

The Data

The data is limited to the following fields:

Tokyo_Data_Set

…which is all I need at this point to identify ideal hotel locations, especially as I am manually inputting all this data.

The range of data itself is very personal to me and is unlikely to make a quality list for anyone else.

(A good example of this is how Denny’s, McDonald’s and KFC are not in the data, as I’m not flying that far east to eat Western fast food! Yet I have Starbucks listed because I bought a Starbucks mug in Kyoto many years ago and would love one from Tokyo.)

The locations in the data set are not all “must sees”; in fact, most are not. The data set is a list of things that may be worth considering, with the ‘Favourite’ field used to identify things of particular interest.

The ‘Type’ field was quite challenging, as much of Tokyo’s entertainment is too unique to fit any particular type! I decided to use ‘Japan’ as an ‘Other’ category to contain things like the cosplay Mario Karts or the Ninja Cafe!

All in all, I have over two hundred rows of data from which to work.

Map Steps in Power BI

I have not done anything remotely clever with Power BI mapping. I’ve just used the standard functionality, which is enough for the task at hand.

Here are the key points to creating the map aspects of the dashboard:

  • Configure Latitude and Longitude columns.
  • Insert a Map.

Map_Icon

  • Drag and Drop co-ordinates to their respective fields.

Map_Values

  • Add a Slicer to filter, and a Table to show the details of a specific map item, to finish off the dashboard.

Tokyo_Full_Dashboard

The Result 

At this stage, the report is just a map with a table showing the name and opening hours. Currently this is all I need.

The full-view map above shows centralised clusters of attractions and a few outliers that will need evaluating as to whether they are worth the effort to see. (I expect the Ghibli Museum to justify any amount of travel!)

The Dashboard 

The following screen prints from the report show the two main clustered areas that are the prime candidates for accommodation, and an outlier that will require a dedicated trip.

Map 1: Cluster of Tourist Spots 

Tokyo_Area_One

Map 2: Another Cluster of Tourist Spots 

Tokyo_Area_Two

Map 3: Outlier: Ghibli Museum 

Tokyo_Ghibli

With these clusters of attractions, it was a breeze to identify the ideal hotel locations and book accommodation accordingly.

A few other insights emerged, such as many of the novelty cafes being clustered together, making it unlikely I will visit more than a few of them, especially when there is an endless list of ‘proper’ restaurants to be visited!

Further Work…

With months to go before the trip, and the hotels intelligently booked, it is now a case of enriching the existing data set over time.

I need to identify which activities require booking in advance, and some will doubtless not be of interest once I investigate further. For example, the aforementioned cosplay Mario Karts looked fun at first glance, but after reading how they annoy locals, I decided against it.

A Cry for Help!!! 

Currently my knowledge of ‘cool stuff’ in Tokyo is (very) limited by what I can find on Google.  With that in mind, anyone out there with any suggested activities and/or eateries in Tokyo, please leave a comment on this blog post!


Snapshot Reporting the Lingering Threat to GDPR Compliance

Introduction

The implementation of the General Data Protection Regulation (GDPR) has caused the business community to reconsider how it collects, holds and distributes personal data. The GDPR will impact every business operating in the EU and, with the effective date of May 25, 2018 fast approaching, cyber security professionals are working tirelessly to amend their practices in order to comply with the new legislation.

The GDPR provides a number of rights for individuals whose data you may be handling; these include: the right to be informed; the right of access; the right to rectification; the right to restrict processing; the right to data portability; the right to object; and rights in relation to automated decision making and profiling.

On top of this, the right to erasure means that the data subject has the right to request erasure of all personal data related to them on any one of a number of grounds.

With all the new compliance methods being introduced, one thing seems to slip through the cracks – the fact that it is not uncommon for data to be moved around an organisation outside of official controls, even when established data channels are available.

This paper will look at the risk this rogue data poses to GDPR compliance and how it can be aligned to the GDPR, as well as the real, day-to-day business benefits that can make this alignment rework cost-effective and justifiable regardless of the GDPR.

What is Snapshot Reporting? 

The ‘snapshot’ is often used in reporting terminology to describe a data extract for a specific moment in time. This does not automatically mean the snapshot is a GDPR liability… but it doesn’t mean it isn’t either!

Why Snapshot Reporting Happens 

In many cases, changes to the SQL script used to export the data are all that is required to turn a snapshot into a repeatable report. And once that logic is in place, the extract can be refreshed whenever required for the full data set, rather than accumulating a collection of historic extracts to build the same set.
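
To make the difference concrete, here is a minimal T-SQL-style sketch, assuming a hypothetical Incident table with CreatedDate and ResolvedDate columns (the table, columns and parameters are illustrative, not from any specific system):

-- Snapshot: the result depends entirely on the day the query is run.
SELECT IncidentID, AssignedTo, Status
FROM Incident
WHERE Status = 'Open';

-- Repeatable: bounded by a reporting period, so the same parameters
-- return the same rows however often the extract is refreshed.
DECLARE @PeriodStart DATE = '2018-03-01';
DECLARE @PeriodEnd   DATE = '2018-04-01';

SELECT IncidentID, AssignedTo, Status
FROM Incident
WHERE CreatedDate < @PeriodEnd
  AND (ResolvedDate IS NULL OR ResolvedDate >= @PeriodStart);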

snapshots

Common situations that can make this challenging include:

  • It’s not SQL!

Most off-the-shelf software is packaged with some form of built-in reporting. This may be a collection of canned reports that meets the more basic queries the vendor thinks are important, or a report-configuration facility that allows bespoke reporting… though usually with only a subset of functionality, enough for basic report development.

The functionality of these built-in reporting utilities varies widely from product to product.  Some do cater for the extended logic required for non-snapshot reporting, some do not.

  • Not Enough Time Stamps

The key to reusable extract logic is being able to consistently identify when something has happened or changed in the source data. When timestamps do not exist for certain events, it becomes impossible to know when or what has changed (see the sketch after this list).

  • No Available Specialist Reporting Software

Not having access to specialised reporting software with which to build dynamic reports is more common than many would suspect. Sure, your organisation has an embedded reporting service, but it takes forever going through the official channels and the data warehouse doesn’t have quite what you need anyway.

In this scenario, a snapshot direct from the source data looks like it will solve the problem nicely. And it will, until the source data is updated and the extracts fall out of alignment.

Ironically, this snapshot data invariably finds itself being sliced and diced in MS Excel, a product that is perfectly capable of direct data source access and applying the logical approach needed to avoid snapshot reporting.

  • In-house Expertise

Every instance of snapshot reporting I have encountered has been a ‘best endeavours’ solution created by well-meaning employees trying to fulfil a business need that would otherwise be impossible.  And it is not uncommon for key business activities to be dependent on said snapshots. (It needs pointing out however that for an experienced SQL user, this sort of change is a logic issue that is well within their skillset to resolve.)

If any variation of the above scenarios exists within your organisation, there is a good chance the accuracy of current reporting is compromised regardless of the GDPR. Fortunately, none of them are necessarily showstoppers in the handling of snapshot reports, but they can shift the problem from being purely logical to encompassing technical challenges, which we’ll look at in the next section.
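
On the timestamp point above, a minimal T-SQL-style sketch (with a hypothetical UpdatedDate audit column, again illustrative) of why timestamps are the key to reusable extract logic:

-- With an audit timestamp, 'what changed since the last extract?'
-- becomes a trivial, repeatable question.
DECLARE @LastExtractRun DATETIME = '2018-03-01';
DECLARE @ThisExtractRun DATETIME = '2018-04-01';

SELECT IncidentID, Status, UpdatedDate
FROM Incident
WHERE UpdatedDate >= @LastExtractRun
  AND UpdatedDate <  @ThisExtractRun;

Without such a column, the only way to detect change is to compare full snapshots against each other, which is exactly the practice we are trying to avoid.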

The Risks

Non-repeatable snapshots may have a minimal GDPR risk if they are deleted after use, but are still a challenge to Data Governance and auditing.

It is worth considering the risks snapshot reporting brings to the organisation before we look at the GDPR implications:

  • Manually Intensive

Snapshots don’t always require manual activity to derive value, but it is usually the case, as the work normally done within a data warehouse still needs to be applied somewhere. This can range from summary operations to merging multiple snapshots together for historic trends.

  • Single Point of Failure

It is bad enough when an employee with extensive specialist knowledge leaves the organisation, but when their private stash of snapshot extracts vanishes with them, the risk is exacerbated. And in the likely event that the snapshot extracts have had undocumented manual transformations applied, that knowledge is lost too.

  • Outside of Data Governance

If the logic used to create a data extract produces different results today than it did last Thursday, the extract recipient is likely to collect historic snapshots and build an unofficial data repository that is unknown to the Data Protection Officer (DPO).

The wider organisation and its associated data controls cannot know what snapshots have been collected for later use or what additional transformations have been applied.  This situation creates a risk for audit trails, data lineage and ‘one version of the truth’ as well as GDPR compliance.

The GDPR Risk

The illustration below shows how personal data can be extracted outside of Data Governance and GDPR compliance through a snapshot, only to resurface months later.

gdpr-snapshot-risk

What are the GDPR Implications?

Not honouring a request to be forgotten, or using personal data for anything other than its permissible purpose, opens the organisation to GDPR risk.

There are two levels of financial penalties related to GDPR:

  • Up to €10 million or 2% of the organisation’s global annual turnover from the previous financial year (whichever is higher).
  • Up to €20 million or 4% of the organisation’s global annual turnover from the previous financial year (whichever is higher).

DPO Responsibilities

The expectation under the GDPR is that the Data Protection Officer (DPO) will have the organisation’s data repositories documented and know exactly where to locate any instances of personal information.

With snapshot extracts being shared outside of any formal controls on distribution and content, the DPO will not have a view of these repositories, which can hold data that has been removed from the rest of the organisation.

The GDPR may exist because of deliberate exploitation of personal data and shoddy security practices, but this does not make responsible organisations immune to getting caught in the crossfire.

What is the Solution?

The rest of this section outlines two possible approaches to handling the risk of snapshot reporting; their suitability will depend on current reporting practices and available resources.

When a Snapshot is not a Snapshot (Logical Approaches)

Consider a scenario where the only way to get data for the previous month is to run a snapshot extract on the first day of the current month: the recipient will collect those extracts in order to build quarterly and yearly summaries.

Most high-GDPR-risk snapshots are the result of this approach to report building, and they are easily addressed once identified.

In this specific example, the recipient requires two changes to their current practices (a sketch of the first follows the list):

  1. The option to extract the data for any full month at any time and receive the same result.
  2. The option to extract the data for any previous quarter (or year) at any time and receive the same result, which also aligns with the aggregated monthly extracts for the same period.
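
A minimal T-SQL-style sketch of the first change, against the same hypothetical Incident table as before: bound the extract by explicit month boundaries rather than ‘everything up to the day the query runs’, so re-running it for any past month returns the same rows.

-- Hypothetical repeatable monthly extract: the same @Year/@Month
-- always yields the same result set, whenever it is run.
DECLARE @Year  INT = 2018, @Month INT = 3;
DECLARE @MonthStart DATE = DATEFROMPARTS(@Year, @Month, 1);
DECLARE @MonthEnd   DATE = DATEADD(MONTH, 1, @MonthStart);

SELECT IncidentID, CreatedDate, Status
FROM Incident
WHERE CreatedDate >= @MonthStart
  AND CreatedDate <  @MonthEnd;

A quarterly or yearly extract is then just a wider range built from the same boundaries, which guarantees it aligns with the aggregated monthly extracts for the same period.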

These two changes, plus a policy of refreshing any report with current data before distribution, will go a long way towards minimising the GDPR risk without bringing all data extracts under existing Data Governance.

Extended Enterprise Data Governance

The ideal solution for handling snapshot reporting is to shut it down completely and provision all data through a governed service.

This is likely to be expensive, but still cheaper than the possible GDPR fines!

Unfortunately, it is also likely to be time consuming, so applying the logical approach outlined above as an immediate safeguard is highly recommended.

The specifics of how this can be implemented will vary dramatically from organisation to organisation and are beyond the scope of this paper.

What to Do Next

The following bullet points outline the investigatory activities that should be undertaken if there is a suspicion of snapshot reporting within an organisation. The amount of effort required depends greatly on what data governance is already in place and the volume of reporting in use.

  • Review Known Reports/Extracts

The people using snapshot extracts within an organisation are doing so to meet a business need. If that business need can be fulfilled by a more stable, less manually intensive activity, it should be welcomed.

  • Review Unknown Reports/Extracts

Finding snapshots that are currently in use but unknown to the organisation is its own challenge. The ease of auditing what data is being extracted, and how, will vary depending on the technical approach used: it may be as simple as refreshing a canned audit report, or require manually surveying activities.

  • Fix

Fixing may mean bringing a snapshot under strict Data Governance through an agreed method, such as provisioning it through a data warehouse, or simply improving the snapshot extract logic. Either way, the fix is a success if there is no longer a need to store data extracts outside of Data Governance.

  • Remove and/or Refresh

Once troublesome snapshots are identified and new data provisions are in place, any existing snapshot extracts should be deleted, with the users of those extracts confident that the data will be available when they need it (i.e. they have a process they can invoke to get the data they need from the governed data service).

  • Amend Existing Data Governance Processes

With all the above done, it is important to improve or introduce official processes that stop snapshot reporting from reappearing over time.

Regular audits to ensure Data Governance is maintained should be considered good practice in general, and are essential for continued data health.

The variation between organisations and technologies makes detailed technical or procedural recommendations impossible. Sooo…

Get in Touch!

One of the biggest challenges in addressing snapshot reporting is having the resource available with the required skillset to carry out the work: work that requires a mix of analysis and in-depth knowledge of data handling (including SQL).

In-house talent may be lacking or, more likely, too busy with their day job for extra work, and hiring external expertise for a two-week piece of work can be difficult and expensive.

Get in touch to see how I can help out, or just to chat about the issues raised in this paper:

Jasondove@hotmail.com

About the Author

I have two decades in the IT industry, consulting for multiple companies and public sector organisations, always with a focus on data in one way or another. I’ve done everything from greenfield data warehouse implementations to statistical analytics, and everything in between.

I am always more than happy to undertake any work that leads to the demise of snapshot reporting!


How to Chart Open Incidents in Power BI for ServiceNow or Any ITSM Using DAX

I have written at length about the logical approach required to accurately measure how many tickets (Incidents or otherwise) are open during a reporting period without resorting to hideous ‘snapshot’ reporting.

My other writing has always been SQL-centric, but I recently faced this challenge with Power BI, which required a very different approach, so I thought I’d share!

The Issue

The data associated with an Incident contains Open Date and Resolved Date fields (I’m ignoring Closed Dates, but the logic is the same). If these dates span two or more months, there is no field capturing the points in time in between from which to chart.

Reporting on what isn’t there is always tricky!

The first choice for this sort of calculation is to push it to the data warehouse (which is where my earlier writing will help!), but Power BI is everywhere and is often used directly against a relational data source or an extracted data set. In many cases, this extra data will have to be generated within Power BI.
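
For comparison, here is a minimal SQL sketch of the warehouse-side approach just mentioned, assuming a hypothetical Incident table and a DimMonth calendar table (both illustrative): join each Incident to every month between its Open Date and its Resolved Date (or today, if unresolved).

-- One row per Incident per month it was open: the same shape of table
-- the DAX below generates inside Power BI.
SELECT i.IncidentID, m.MonthStart
FROM Incident AS i
JOIN DimMonth AS m
  ON  m.MonthStart >= DATEFROMPARTS(YEAR(i.OpenDate), MONTH(i.OpenDate), 1)
  AND m.MonthStart <= COALESCE(i.ResolvedDate, GETDATE());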

The Solution

Using DAX (Power BI’s built-in scripting language), we are going to create a new table derived from a cut-down mock-up of an ITIL/Incident extract that could easily be created using ServiceNow’s built-in reporting capabilities, or those of most other ITSM tools out there.

The bad news is that Power BI DAX is limited in its methods of creating new data, as opposed to summarising or reformatting existing data. To be fair, this should be expected of reporting software!

To keep the focus on the matter at hand, the data set has been kept to the absolute minimum.  In the real world, the data set would be far more comprehensive.

Step 1

First, we need to know how many months (as most ITIL reporting is monthly) an Incident is open before we can create the new table. With the DATEDIFF function, this is straightforward until we get to Incidents that are currently open and have no Resolved Date value.

We just compare the Open Date to the Resolved Date using monthly intervals to create a Calculated Field.

For the context of showing a historic trend of open Incidents, an Incident being resolved after the reporting period is the same as it being open during said timeframe. With this in mind, we shall treat the current date as the ResolvedDate of unresolved Incidents for the sake of this calculation, like so:

MonthsCount =
    IF(Sheet1[ResolvedDate] = BLANK(),
        DATEDIFF(Sheet1[OpenDate], TODAY(), MONTH),
        DATEDIFF(Sheet1[OpenDate], Sheet1[ResolvedDate], MONTH))

A couple of points to note:

  • An Incident that was opened and resolved in the same month will return zero, which is just what we want.
  • Treating an unresolved Incident as resolved on the current date shows the Incident as open from its inception until the end of the reporting period (i.e. the last twelve full months).

Step 2

With this newly created MonthsCount field, we can generate the Calculated Table from which to populate charts.

To achieve this, we will use the following functions:

  • GENERATE
  • FILTER
  • SELECTCOLUMNS
  • GENERATESERIES
  • EDATE

Altogether, it looks like this:

OpenTicketDates = GENERATE(
    Sheet1,
    FILTER(
        SELECTCOLUMNS(
            GENERATESERIES(0, 200),
            "Value", [Value],
            "ChartMonth", EDATE(Sheet1[OpenDate], [Value])
        ),
        [Value] <= Sheet1[MonthsCount]
    )
)

The above code creates a new table and, for each Incident, inserts as many rows as identified by MonthsCount, using EDATE to increment the month for each row.

Note:

  • DATEADD will not return a result that isn’t in the current date range…it is far easier to use EDATE!

Below is the resulting table, which now contains a row for each month an Incident is open:

Fig. 1: The dynamically generated trend table.

Step 3

Use the Calculated Table to populate the required chart(s).

Below is my end result, presented to show the dynamic trend data rather than to win any design awards:

Fig. 2: The trend data showing the results for the full months in the current year.

Fig. 3: The same charts filtered for the full year.

And that is it!

Summary

This solution is simple enough, with only a few gotchas along the way! The most difficult thing is the switch in thinking away from an SQL mindset and into the wonderful world of Power BI’s Excel-esque approach to data management. Personally, I love it! But then, I’ve been using Excel as a supporting BI tool my entire career, and most of Power BI/DAX makes perfect sense to me… and then there is DATEADD!

Of course, the real beauty of this solution is that it is built into the report and works from the original data, thus removing any manual interaction, reliance on a data warehouse, or snapshot reporting.


Making the Most of That Last Interview Question (Guest Post)

So much of the interview advice bandied about on the internet is about damage limitation. Advice like “Don’t be late” or “Dress smartly” is great if you want to avoid sabotaging your interview, but if you want to get ahead of the crowd, advice gets thin on the ground.

This article is my humble attempt to share a true value-add interview tip, so with no more fluff or fuss…

Just the Tip

The interview is virtually over and it probably feels like the job is won or lost already; there is just that pointless final question left:

“Do you have any questions for us?”

The tip is to answer this with:

“Is there anything I can do between now and starting the job to make me even better suited?”

That line puts the interviewer in a position where they either have to say out loud that you are perfect for the job (which is no small thing), or name a specific shortcoming and give you another chance at convincing them of your abilities or experience.

Of course, if you are genuinely weak in the named area this is not going to help much. But often it is a case of accidental omission on the part of the interviewee. It is all too easy to not give enough coverage to a particular subject the interviewer is interested in, especially if the job requires numerous skills or broad experience.

My Personal Experience

I would never advise anything unless I had used it successfully myself.

And this tip is gold! Since being told about it by a fellow freelancer a few years ago, I have used this approach ten to fifteen times (as a freelancer I tend to interview every six to nine months) and it has been well received every time.

The usual response is a nod and a wry smile from the interviewer(s) as they acknowledge the brilliance of the question. I have even had interviewers compliment me on it and admit they will use it themselves at their next interview.

Does it guarantee you’ll get the job?

No, but nothing will.

There are always factors outside our control. But personally, I can directly credit this question with securing at least three jobs where concerns were raised around my abilities or experience and I was able to address those fears.

Obviously, making this ‘tip’ public potentially weakens it as a strategy for myself, so I’ll probably have to return to asking if it is okay to use the disabled parking if I’m hungover!

About the Author

Dean Paumme is an active freelancer in the Business/IT arena and author of “The Secret Route to Riches: How to make millions with your current career”, a no-nonsense, tip-laden guide to making the most of your skillset.

Get his book here (amazon.com):

Or here (amazon.co.uk):

The Secret Route to Riches by Dean Paumme


Helpful ServiceNow Reporting Articles

I thought it would be helpful to put together a contents list of the ServiceNow reporting blogs I’ve written over the last couple of years, for easy reference.

So, here it is!
ServiceNow Reporting and Why Direct ODBC Reporting is the Only option

A few Random Tips for ServiceNow Reporting

Reporting Software and ServiceNow

How to Overcome the Top 5 Challenges for ServiceNow Reporting

Example ServiceNow Report

This post is more around generic ITIL reporting, but may still be interesting:
The Top Three ITIL Reporting Pitfalls (pt 1)

…and this last one has nothing to do with ServiceNow reporting, but I’m including it anyway as someone typically asks:
How to Become an IT Consultant for Fame and Profit


How to Become an IT Consultant for Fame and Profit

I seem to get more emails asking how to become a self-employed consultant than about anything I actually blog about.

I’ve not written that much recently, so I thought I would put up a quick post with some advice on the subject.

So, my top piece of advice (and only piece of advice!) is to buy this book:

The Secret Route to Riches by Dean Paumme

(Please note, that is not a paid link, just a graphic I’m not getting paid for!)

The above book costs virtually nothing as a Kindle download and covers all the things you need to know to get started as a consultant, whether as a Business Intelligence expert or in any other IT discipline.

I received this book as a gift, about a decade too late for me, but it would have been a life saver (ok, money saver!) back when I was starting out. It’s packed with practical advice on everything from finding contracts to the correct mindset for making the most of them.


How to Overcome the Top 5 Challenges for ServiceNow Reporting

The Stuff I Don’t Need to Tell You

I am pretty sure that if you’re reading this you already know why good reporting is crucial for successful ITIL and why control beyond the Report Module in ServiceNow is a necessity.

If not, look here.

And as if that isn’t grim enough, nearly two thirds of all Reporting/Business Intelligence projects fail or, at the least, don’t succeed fully.

Your Top 5 Enemies

I have worked on numerous Business Intelligence projects of all shapes and sizes and the same few issues arise across all types of reporting and all business sectors.

1.  Learning the Database

Knowing which data lives in which tables, and the different permutations of how they can all link together depending on the business need.

Figuring all this out from scratch is difficult. So difficult that I wrote this post to try and help you out.

2.  Organisational Thirst for Information

Businesses need information, and a lengthy BI project can take months to complete. This can lead to end-user attempts at report development in Excel and Access… which can be great, but often isn’t!

3.  Budget

A full BI Project is expensive and requires a range of skilled individuals to bring it to fruition before you even look at software licenses and hardware.

4.  Skills Shortages

Good BI people are constantly busy.

Finding the right person with the right software knowledge and business sector experience can be a huge challenge in its own right, and ITIL is one of the more complex subjects to report on.

5.  Specialist Software

Reporting software can get very expensive, but even when it doesn’t, it can take a long time to buy through company purchasing procedures, making the wait for information even longer.

Help Me Help You

If you are worried about any or all of the above… you should be! And these problems are generic; ITIL and ServiceNow come with their own challenges.

Let me help.

I know the data (1), having developed ServiceNow reports for a global market since 2010 (and ITIL reporting since 2004), and have a starter set of 20+ top quality reports that can be used almost instantly (2) for a low price (3), developed by myself (4) in Crystal Reports (5ish!).

An example of the clear and concise ServiceNow reports available.

This pack contains a library of ODBC-driven, clear and meaningful ServiceNow reports spanning Incident, Problem and Change management, covering SLA/OLA/KPI measurement as well as volume and frequency for a holistic view of the service management slice of your ITIL implementation.

If required, these reports can be branded with your company colour(s) and logo at no extra charge.

Another example of the clear and concise ServiceNow reports available.

Included in the price is one day of offsite consultancy to aid any reconfiguration needed to match your company’s unique setup.

On-going Support

Crystal Reports is a common skillset, and with this starter pack as the foundation, most Crystal Reports developers should be able to pick it up and expand the suite.

OR

At the least, simply use the SQL and table structures from Crystal Reports as a massive head start in both analysis and development in whatever reporting software you prefer.

OR

Do both! Use the Crystal Reports to satisfy the need for data almost instantly while a lengthier solution is developed.

I am available for future support, amendments or additional development in Crystal Reports, Qlikview, SSRS, BO/WebI and even Excel/Access (which are comfortably capable of meeting BI requirements in the right hands).

I provide offsite support starting at half a day.

Another example of the clear and concise ServiceNow reports available.

Annual Fees Can Kiss My SAAS

So, the actual cost: FREE with only one day of reasonably priced offsite consultancy!

This is a one-off fee, and support can be hired as and when you need it. This may seem an odd approach, but I’ve chosen this route for three reasons:

1.  One day (eight hours) should be enough support to get you up and running with the reports.

2.  It makes it very easy to see what a great deal it is, as it would probably cost 2-5 days of consultancy per report.

3.  It is usually easier to get sign-off for a consultant than to purchase a product.

While I would like to sell as many of these packs as possible, I only have so much bandwidth for support and further development and want to ensure I provide a top class service.

To this end, only a limited number of these packs will be released at any one time.

Act Now!

If you want to discuss this offering in more depth, such as the specific reports that make up this library (or anything else around ServiceNow reporting, for that matter!), please email me. I love this stuff and am happy to chat about it:

JasonDove@Hotmail.com

PS: Check out my various ServiceNow blog posts which will be of great help to any organisation looking at professional reporting.

PPS: Of course, I am also available for bespoke reporting in a wide variety of software (namely: Crystal Reports, Qlikview, SSRS, BO/WebI, even Excel/Access).
