Additional Documentation for Custom Policy Hooks

pilotjohn4
Level 2

Hi there,

I am trying to find a couple of ways to extend the GDPR plugin to restrict bundle deployment or job runs on certain projects when a dataset contains PII. I was looking at the Custom Policy Hooks but can't find much documentation. Is there any additional help available for programming/developing with Custom Policy Hooks?

matthieu
Level 3

Hi @pilotjohn4 ,

As your post is quite old now, have you found a way to extend the GDPR plugin to restrict running jobs on datasets containing PII on certain projects?

A way to do it could be to implement the restriction in the handleRecipeInputsOutputs function of the GDPR plugin, denying the action if one of the datasets attached to the recipe contains PII. (I could only find a way to deny creating a new recipe with a source dataset containing PII, as the only hook that seems suitable for that is the onPreObjectSave hook, which does not deny running a previously created recipe.)

If the input dataset contains PII, i.e. !"NO".equals(sd.customFields.get("gdpr_contains_personal_data").getAsString()) or "YES".equals(sd.customFields.get("gdpr_contains_personal_data").getAsString()) depending on how strict you want to be, you could then test the input dataset's projectKey, or even access the input dataset's project object, with the code below:

// key of the project that contains the input dataset
String projectKey = sd.projectKey;
// full project object, looked up through DSS's projects service
SerializedProject sp = projectsService.getMandatoryUnsafe(sd.projectKey);
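
Putting those pieces together, a minimal sketch of such a check could look like the code below. To be clear, this is not the GDPR plugin's actual code: the class name, the helper's signature, the restricted-project list and the import package paths are assumptions, and the way you obtain the recipe's input datasets and the projects service should be taken from the skeleton DSS generates when you add a custom Java policy hook to a dev plugin. Only the custom-field test and the project lookup come from the snippet above.

// Minimal sketch only, not the GDPR plugin's actual code. The import package paths,
// the class name and the helper signature are assumptions; take the real ones from
// the skeleton DSS generates when you add a custom Java policy hook to a dev plugin.
import java.util.Arrays;
import java.util.List;

import com.google.gson.JsonElement;

import com.dataiku.dip.coremodel.SerializedDataset;      // package path: assumption
import com.dataiku.dip.coremodel.SerializedProject;      // package path: assumption
import com.dataiku.dip.server.services.ProjectsService;  // package path: assumption

public class PIIRunRestriction {

    // Hypothetical list of projects on which recipes must not consume PII datasets
    private static final List<String> RESTRICTED_PROJECTS = Arrays.asList("CUSTOMERS", "MARKETING");

    /**
     * Meant to be called from handleRecipeInputsOutputs with the recipe's input
     * datasets. Assumption: throwing from the hook is what denies the action.
     */
    public static void checkInputs(List<SerializedDataset> inputDatasets,
                                   ProjectsService projectsService) throws Exception {
        for (SerializedDataset sd : inputDatasets) {
            JsonElement flag = sd.customFields.get("gdpr_contains_personal_data");
            // Strict variant: anything that is not explicitly "NO" counts as PII.
            // For the lenient variant, test "YES".equals(flag.getAsString()) instead.
            boolean containsPII = flag != null && !flag.isJsonNull()
                    && !"NO".equals(flag.getAsString());
            if (!containsPII) continue;

            String projectKey = sd.projectKey;
            // Full project object, if you need more than the key to decide
            SerializedProject sp = projectsService.getMandatoryUnsafe(projectKey);

            if (RESTRICTED_PROJECTS.contains(projectKey)) {
                throw new Exception("An input dataset in project " + projectKey
                        + " contains personal data: running this recipe is denied");
            }
        }
    }
}

Here denial is signalled by throwing, which is an assumption about how the hook vetoes an action; the restricted-project test is also just an example, and you could instead look at a tag or a custom field on the SerializedProject.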

The DSS Java policy hooks plugin can be compiled and packaged using Apache Ant. The build.xml is automatically created at the root of your plugin when you add a new custom Java policy hook to the dev plugin. Here is the file needed to rebuild the GDPR plugin:

<project name="gdpr-hooks" default="jar">
    <property name="build.dir" value="java-build" />
    <property name="dist.dir" value="java-lib" />
    <property environment="env"/>

    <target name="clean">
        <delete dir="${dist.dir}" />
        <delete dir="${build.dir}" />
    </target>

    <target name="jar">
        <path id="lib.path.id">
            <fileset dir="${env.DKUINSTALLDIR}/lib/ivy/backend-run" />
            <fileset dir="${env.DKUINSTALLDIR}/lib/ivy/common-run" />
            <fileset dir="${env.DKUINSTALLDIR}/lib/shadelib" />
            <fileset file="${env.DKUINSTALLDIR}/dist/dataiku-core.jar" />
            <fileset file="${env.DKUINSTALLDIR}/dist/dataiku-app-platform.jar" />
            <fileset file="${env.DKUINSTALLDIR}/dist/dataiku-dss-core.jar" />
            <fileset file="${env.DKUINSTALLDIR}/dist/dataiku-scoring.jar" />
            <fileset file="${env.DKUINSTALLDIR}/dist/dataiku-dip.jar" />
            <fileset file="${env.DKUINSTALLDIR}/dist/dataiku-dss-core.jar" />
        </path>
        <mkdir dir="${build.dir}" />
        <mkdir dir="${dist.dir}" />
        <javac debug="true" destdir="${build.dir}" classpathref="lib.path.id" encoding="utf-8" includeantruntime="false">
            <compilerarg value="-Xlint:all" />
            <src>
                <pathelement location="java-policy-hooks/gdpr-hooks" />
            </src>
        </javac>
        <jar destfile="${dist.dir}/dss-plugin-test-java-hooks.jar">
            <fileset dir="${build.dir}" />
            <fileset dir="java-policy-hooks/gdpr-hooks/src/">
                <include name="**/*.yaml"/>
            </fileset>
        </jar>
    </target>
</project>

Don't forget to export DKUINSTALLDIR (your DSS install directory) as an environment variable before running ant:

export DKUINSTALLDIR=/home/dataiku/dataiku-dss-10.0.7
ant
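
For reference, the paths and properties in the build.xml above imply the following layout inside the dev plugin directory (names taken straight from the build file; adjust them if your generated skeleton differs):

<plugin dev directory>/
    build.xml                   generated at the root of the plugin
    java-policy-hooks/
        gdpr-hooks/             the <src> path compiled by javac
            src/                sources plus the hook's *.yaml descriptors, packaged into the jar
    java-build/                 build.dir: compiled classes, created by the jar target
    java-lib/                   dist.dir: receives dss-plugin-test-java-hooks.jar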

 

Cheers,

Matthieu

 

CoreyS
Dataiker Alumni

It is never too late to share a solution, so thank you for sharing yours with the Community, @matthieu.

Looking for more resources to help you use Dataiku effectively and upskill your knowledge? Check out these great resources: Dataiku Academy | Documentation | Knowledge Base

A reply answered your question? Mark as 'Accepted Solution' to help others like you!