
https://beisdigital.blog.gov.uk/2013/05/01/science-and-society-review/

Science and Society review

Categories: Case study, Engagement, Strategy

In Autumn 2012, the BIS Science and Society team started a review of our programme and objectives.

As a team, we spend around £13 million a year, part of an allocation set out in the Science and Research Budget, on a number of activities centred on increasing engagement with science and research.

The review was prompted by 2 things:

  • a requirement from the Cabinet Office to review the small number of activities in the programme which fell within the remit of the Efficiency and Reform Group (ERG) marketing controls
  • a need to review the evidence for the programme in preparation for the next spending review

We decided to combine the 2 into a single whole-programme review, so that the communications activities were assessed in context. In the review, we wanted to establish a clear vision and to develop a new set of objectives and priorities in collaboration with the science and society community in the public, private and third sectors.

Goals

Referring back to the Cabinet Office/Demsoc key goals for online policy making, this project had 4 main objectives, namely to:

  • clarify objectives
  • generate ideas
  • test out new ideas
  • create and design a service with relevant stakeholders 

How we went about it

We already had a WordPress-based microsite, which has been used (in gradually revised forms) since a 2008 consultation, A Vision for Science and Society. That consultation was one of the first in government to be delivered in a commentable, online format.

It, in turn, led to the establishment of 5 externally-led expert groups, each tasked with developing an Action Plan on one of 5 key science and society themes. Each of those groups was active from mid-2009 and published its plan in early 2010. During that time, the microsite was used to enable comments on the work of the groups, to host publicly available minutes, to host completed action plans and, in the case of some groups, to report on progress towards those actions in 2011. During the 2010 to 2012 period, team members looked after the site themselves, with support from the BIS digital team.

As the structure and format of the site were well established, we decided, with the assistance of the digital team, to develop it further and use it as part of our 2012/2013 review process, building on the existing stakeholder relationships generated through the site. The online part of the review ran in a number of stages.

Stakeholders from the science and society community actively shared our content and our requests for views, giving a theoretical Twitter reach of more than 250,000 accounts for the review. However, that did not translate into direct comments on the site or on the vision, aims and objectives.
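
As an aside, "theoretical reach" figures like this are usually estimated by adding up the follower counts of every account that shared or mentioned the content, so the same person can be counted more than once and real exposure is normally lower. The short sketch below illustrates that calculation only; the account names and follower counts are made-up placeholders, not data from the review, and this is not necessarily how the figure above was produced.

    # Minimal sketch of a "theoretical reach" estimate (illustrative only).
    # Assumes follower counts have already been collected for each account
    # that shared the content (for example via the Twitter API); the values
    # below are placeholders, not figures from the review.
    sharing_accounts = {
        "example_account_a": 12_000,  # followers
        "example_account_b": 45_500,
        "example_account_c": 3_200,
    }

    # Theoretical reach: the sum of followers across all sharing accounts.
    # It over-counts real exposure because audiences overlap and not every
    # follower sees every tweet.
    theoretical_reach = sum(sharing_accounts.values())
    print(f"Theoretical reach: {theoretical_reach:,} accounts")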

The digital element was only part of the whole process. We supplemented these activities with face-to-face contact with stakeholders, and later the digital element was replaced by more detailed face-to-face workshops with those stakeholders.

What worked well

Working with the digital team and making better use of complementary digital channels (the microsite, Twitter and other online networks) gave us the potential to reach new audiences and gain insights beyond the usual suspects.

What worked less well

Despite the high reach referred to above, and the general online conversation around the review, there was a very low conversion rate to actual comments.

Our limited team resources meant that, this time round, there was less scope for responding to comments in real time, which led us to scale back the ambition of our digital engagement activity.

We were targeting stakeholders as part of an integrated engagement package, including face-to-face meetings and workshops, and many of those stakeholders are more familiar and comfortable with responding in non-digital ways.

How the policy was shaped (impact on policy)

The online comments (both on our microsite and elsewhere) had a direct impact on how our new vision was finalised and on how the aims and criteria for BIS support were shaped: some sharp online criticism led the team to reframe all of these. For instance, it became clear that the objectives originally proposed were not SMART enough, and that what was actually being defined were broader aims.

Ongoing impacts on the policy team 

Working with the digital team has broadened our team's appreciation of using an integrated suite of digital channels, and of the benefit of BIS's increasing presence on a number of digital platforms for reaching new audiences.

As a team, we have seen both the benefits and the limitations of digital approaches since that first consultation in 2008.

New team members have been trained to update the microsite, which will allow continued engagement with the stakeholder community as the Science and Society programme is finalised. We envisage working with the digital team to use their channels for ongoing communication of the outcomes of the review.

Digital Team Takeaway

  • Reach and general interest are no guarantee of actual engagement.
  • An established online community is less likely to comment on a blog or microsite than to carry on conversations within its existing networks, for example on Twitter


1 comment

  1. Comment by Antony

    Re: "How the policy was shaped (impact on policy)", this is one of the big unknowns as far as people engaging with Whitehall are concerned.

    One of the things government departments - and large organisations in general - need to ask themselves is this: "To what extent do we want to be influenced by the feedback we receive through social media?"

    This means a number of things:

    1) How well are your policy advisers engaging with the public, as well as key stakeholders, through social media? (e.g. does it have to be done through a conduit, i.e. a digital media team, or have you trained and empowered policy officials to do so directly?)

    2) What methods are you using to analyse your social media activity? How does this compare to other departments? What input has the Government Digital Service had in advising you? Are you looking for a consistent approach across Whitehall and the public sector beyond?

    3) What systems and processes do you have to ensure that the analysis mentioned in 2) is fed into your decision-making processes? You can have the greatest set-up in 2), but unless it is fed back into the key departmental boards (corporate, policy, programme and project boards) AND has a visible impact on policy, it is as good as worthless.

    4) How can you evidence to specialist stakeholders interested in the policy detail, to Parliament and to the general public that you have used social media constructively to improve the policies that ministers have set out? For example, stating that you wanted to choose option A rather than B, but that feedback giving reasons X, Y & Z made it clear that option B was better than A, hence going with B.

    Finally...

    To what extent are your social media activities complementing (adding to and improving) your existing outreach activities done through traditional channels?

